This project is mirrored from https://github.com/metabase/metabase. Pull mirroring updated.
  1. Sep 27, 2023
  2. Sep 26, 2023
    • adam-james
      Alert API sends all alert information to any user. Change this to onl… (#34067) · 0920254f
      adam-james authored
      * Alert API sends all alert information to any user. Change this to only show those pulses the user has created or is
      a recipient of
      
      * Adjust tests to show that only admins see all alerts. Regular users only see their alerts.
      
      * Change docstring to describe behaviour of the endpoint.
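      A minimal sketch of the visibility rule described above, in plain Clojure (function and key names are hypothetical, not the actual endpoint code): admins see everything, other users see only the alerts they created or receive.

      ```clojure
      ;; Hypothetical sketch: admins see all alerts, everyone else sees only
      ;; the alerts they created or are a recipient of.
      (defn visible-alerts
        [alerts user-id admin?]
        (if admin?
          alerts
          (filterv (fn [{:keys [creator-id recipient-ids]}]
                     (or (= creator-id user-id)
                         (contains? (set recipient-ids) user-id)))
                   alerts)))
      ```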
    • Tim Macdonald
    • Cam Saul
      Remove deprecated `current-db-time` method (#32686) · 47a52b57
      Cam Saul authored
      
      * Remove deprecated current-db-time method
      
      * Address PR feedback
      
      * sync-timezone! should handle java.time.ZoneId and java.time.ZoneOffset
      
      * Allow time zone offsets as zone IDs
      
      * Fix tests and suppress clj-kondo on a few forms
      
      * Suppress warning on one more occurrence of db-default-timezone
      
      * Map offset Z to time zone UTC when syncing
      
      * Accept zone offsets, zone ids and strings for h2 and oracle
      
      * Mark snowflake as not supporting report timezone
      
      * Revert "Mark snowflake as not supporting report timezone"
      
      This reverts commit 70b15a60334fda707d695bc6c01621961f1c5aa8.
      
      * Fix report-timezone handling
      
      * Numeric timezone
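      The zone-handling bullets above can be sketched as a small normalization helper (name and exact behavior are assumptions based on this commit message, not the production `sync-timezone!` code). Note that `ZoneOffset` is a subclass of `ZoneId`, so the offset case must be checked first, and the `Z` offset is mapped to `UTC`:

      ```clojure
      (import '(java.time ZoneId ZoneOffset))

      (defn ->timezone-string
        "Hypothetical normalization: accept a ZoneId, a ZoneOffset, or a string,
         mapping the Z offset to the UTC time zone."
        [tz]
        (cond
          ;; ZoneOffset extends ZoneId, so it must be checked first
          (instance? ZoneOffset tz) (if (= ZoneOffset/UTC tz) "UTC" (str tz))
          (instance? ZoneId tz)     (.getId ^ZoneId tz)
          (string? tz)              tz))
      ```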
      
      ---------
      
      Co-authored-by: metamben <103100869+metamben@users.noreply.github.com>
      Co-authored-by: Tamás Benkő <tamas@metabase.com>
    • lbrdnk
    • Nick Fitzpatrick
      Only run BE tests on static viz FE changes (#33760) · 39acfd4a
      Nick Fitzpatrick authored
      * output static viz sources, update be workflow and file paths
      
      * build static viz before file check
      
      * extending file check timeout
      
      * fixing mistake
      
      * disable optimization when generating file paths
      
      * prefer offline
      
      * moving workflows to save 89505
      
      * Maybe caching the build?
      
      * removing minification
      
      * upload artifact instead of use cache
      
      * Add workflows back
      
      * reduce files changed timeout
      
      * removing unneeded yarn install
    • dpsutton
      X ray stage 1 (#34026) · 400850b9
      dpsutton authored
      * reintroduce the functions
      
      * using affinities
      
      had to update the affinities map to handle multiple definitions of the
      same affinity name.
      
      ```clojure
      core=> (let [affinities (-> ["table" "GenericTable"]
                                  dashboard-templates/get-dashboard-template
                                  dash-template->affinities-map)]
               (affinities "HourOfDayCreateDate"))
      [["HourOfDayCreateDate"
        {:dimensions ["CreateTimestamp"], :metrics ["Count"], :score 50}]
       ["HourOfDayCreateDate"
        {:dimensions ["CreateTime"], :metrics ["Count"], :score 50}]]
      ```
      
      Here there were two cards defined as "HourOfDayCreateDate". One looked
      for a createtimestamp, the other for a createtime. When treating the
      affinities as unique names, the one with createtime clobbered the
      createtimestamp and then it went unsatisfied.
      
      Now we keep both definitions around. The group-by is stable, so the
      timestamp definition comes first and, during matching, that one will be matched.
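      The duplicate-name handling can be sketched with plain `group-by`, which keeps every definition under the shared name in source order (definitions abbreviated from the example above):

      ```clojure
      ;; Two cards share a name; group-by keeps both, CreateTimestamp first.
      (def card-defs
        [["HourOfDayCreateDate" {:dimensions ["CreateTimestamp"] :score 50}]
         ["HourOfDayCreateDate" {:dimensions ["CreateTime"] :score 50}]])

      (def affinities (group-by first card-defs))
      ```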
      
      * i don't like reduced for some reason
      
      * fixup affinities
      
      1. Affinities were in the wrong shape:
      
      before:
      
      ```clojure
      {"card-name" [["card-name" definition]
                    ["card-name" definition]]
       ,,,
      ```
      
      new shape:
      
      ```clojure
      {"card-name" [definition
                    definition]
       ,,,
      ```
      
      ex:
      
      ```clojure
      core=> (let [affinities (-> ["table" "GenericTable"]
                                  dashboard-templates/get-dashboard-template
                                  dash-template->affinities-map)]
               (affinities "HourOfDayCreateDate"))
      ({:dimensions ["CreateTimestamp"], :metrics ["Count"], :score 50}
       {:dimensions ["CreateTime"], :metrics ["Count"], :score 50})
      ```
      
      2. Erroring on unfindable filters/metrics: using
      `{:filter [:dimension ::unsatisfiable]}` didn't work correctly
      because of this:
      
      ```clojure
      ;; it doesn't look for dimensions by keyword
      core=> (dashboard-templates/collect-dimensions [:dimension ::nope])
      ()
      core=> (dashboard-templates/collect-dimensions [:dimension (str ::nope)])
      (":metabase.automagic-dashboards.core/nope")
      ```
      
      * Updating dash-template->affinities-map to dash-template->affinities.
      
      Two key changes were made:
      - The card name is now a part of the 'affinity' object (under the `:affinity-name` key).
        This flattening should also make affinities easier to deal with and more flexible
        (e.g. arbitrary groupings).
      - All base dimensions are exploded out into these affinity objects under the `:base-dims` key.
        Note that this may result in growth of the number of affinities when named items are
        repeated with different definitions. For example, card names are not unique, resulting in
        N affinities per card. Also, metrics and filters need not be unique. GenericTable, for
        example, has 6 definitions of the Last30Days filter. This will result in 6X the number of
        affinities created for each card using that filter.
      
      This change encapsulates affinities; before, you'd need to know what dimensions underlie
      any given metric or filter. ATM, we do not package each filter or metric definition in each
      affinity object, but perhaps that would be worth doing in the future for complete encapsulation.
      
      This change also allows for very simple matching logic. To match card affinities, for example,
      you just filter all affinities for which the affinity dimensions are a subset of the provided
      dimensions.
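      The subset-based matching just described can be illustrated like so (the shapes are assumptions based on this description, not the exact production schema):

      ```clojure
      (require '[clojure.set :as set])

      ;; Illustrative flattened affinity objects.
      (def affinities
        [{:affinity-name "AverageIncomeByMonth" :base-dims #{"Income" "Timestamp"}}
         {:affinity-name "RowcountLast30Days"   :base-dims #{"CreateTimestamp"}}])

      (defn matching-affinities
        "Affinities whose base dimensions are a subset of the available dimensions."
        [affinities available-dims]
        (filterv #(set/subset? (:base-dims %) available-dims) affinities))
      ```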
      
      * Updated tests for new affinity code. Removed accidental cruft.
      
      * Change shape of affinities
      
      The "satisfied-affins" are of this shape:
      ```clojure
      {"AverageIncomeByMonth" #{"Income" "Timestamp"},
       "AverageDiscountByMonth" #{"Income" "Discount" "Timestamp"}}
      ```
      
      And they are ordered so they can drive the `make-cards` function in the
      future. The idea is that, for right now, we'll look up the card based on
      affinity name and, when multiple cards are found, the set of dimensions they
      depend on. And that will drive the layout.
      
      But in the future, just the affinity itself will drive how we make a
      card layout. This is the first step towards that.
      
      * Satisfied-affinities shape is ordered map to vector of sets of dims
      
      ```clojure
      {"RowcountLast30Days" [#{"CreateTimestamp"}],
       "Rowcount" [#{}]}
      ```
      
      Since each affinity can have several definitions, it can be satisfied in multiple ways.
      
      * Update comment
      
      ```clojure
      core=> (let [affinities (-> ["table" "GenericTable"]
                                  dashboard-templates/get-dashboard-template
                                  dash-template->affinities)]
               (match-affinities affinities {:available-dimensions {"JoinDate" :whatever}}))
      {"Rowcount" [#{}],
       "RowcountLast30Days" [#{"JoinDate"}],
       "CountByJoinDate" [#{"JoinDate"}],
       "DayOfWeekJoinDate" [#{"JoinDate"}],
       "DayOfMonthJoinDate" [#{"JoinDate"}],
       "MonthOfYearJoinDate" [#{"JoinDate"}],
       "QuerterOfYearJoinDate" [#{"JoinDate"}]}
      ```
      
      * comment block is helpful :octopus:
      
      
      
      * Drive `make-cards` from affinities
      
      The old style was that make-cards looped over all cards to see which ones
      were satisfied. Now we've taken a notion of "interestingness", which we call
      affinities: which dimensions are interesting in combination with each
      other. Right now these are derived from card combinations, but that will
      change going forward.
      
      So now going into the make-cards loop are interesting combinations, and
      we then grab a card-template from the combination. Again, it's a double
      lookup back to cards, but this lets us break that cycle and come up with
      interesting card templates based just on the groupings themselves.
      
      In the future, we're going to want an affinity to produce multiple
      card-templates, so this will become a mapcat of some sort rather than a
      map.
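      A hedged sketch of that loop shape (names are illustrative, not the real `make-cards`): each satisfied affinity is looked up by name to get its card template(s), and `mapcat` already covers the future case where one affinity yields several templates.

      ```clojure
      (defn make-cards-sketch
        "Illustrative only: turn satisfied affinities into card templates by
         looking each affinity name up in a template index."
        [satisfied-affinities affinity-name->card-templates]
        (mapcat (fn [[affinity-name _dim-sets]]
                  (affinity-name->card-templates affinity-name))
                satisfied-affinities))
      ```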
      
      * Removing pre-check from card-candidates and corresponding unit test as this is not an invariant -- card-candidates should always be satisfied with our affinity mapping.
      
      * comment and docstring
      
      * Changed names for clarity (affinity -> affinity-sets) and modified
      the return value of match-affinities to have values of sets of sets
      rather than vectors of sets. This makes matching simpler and easier.
      
      Added a docstring to CardTemplateProvider and started adding some tests.
      
      * Revert "Changed names for clarity (affinity -> affinity-sets) and modified"
      
      This reverts commit dd2aef1fea8e6deb5f970e51f698e4b72fa97b32.
      
      * Something about either the cherry-pick or stale state made the previous
      change of affinities (as a vector of set) to affinity-sets (a set of sets)
      cause failures. It may just be that the implementation was broken and
      the tests passed due to a stale state ¯\_(ツ)_/¯.
      
      This picks out the clarity and doc changes and reverts the set of sets.
      
      * Adding schemas for the affinity functions in metabase.automagic-dashboards.core
      
      * Created the AffinitySetProvider protocol, which, given an item, will
      provide the set of affinities (a set of set of dimensions) required to
      bind to that item. The initial implementation reifies the protocol
      over a dashboard template and provides affinity sets for cards, but
      this protocol could be extended to provide affinities for whatever
      object we desire.
      
      The initial implementation looks like so:
      
      ```clojure
      (p/defprotocol+ AffinitySetProvider
        "For some item, determine the affinity sets of that item. This is a set of sets, each underlying set being a set of
        dimensions that, if satisfied, specify affinity to the item."
        (create-affinity-sets [this item]))
      
      (mu/defn base-dimension-provider :- [:fn #(satisfies? AffinitySetProvider %)]
        "Takes a dashboard template and produces a function that takes a dashcard template and returns a seq of potential
        dimension sets that can satisfy the card."
        [{card-metrics :metrics card-filters :filters} :- ads/dashboard-template]
        (let [dim-groups (fn [items]
                           (-> (->> items
                                    (map
                                      (fn [item]
                                        [(ffirst item)
                                         (set (dashboard-templates/collect-dimensions item))]))
                                    (group-by first))
                               (update-vals (fn [v] (mapv second v)))
                               (update "this" conj #{})))
              m->dims    (dim-groups card-metrics)
              f->dims    (dim-groups card-filters)]
          (reify AffinitySetProvider
            (create-affinity-sets [_ {:keys [dimensions metrics filters]}]
              (let [dimset                (set (map ffirst dimensions))
                    underlying-dim-groups (concat (map m->dims metrics) (map f->dims filters))]
                (set
                  (map
                    (fn [lower-dims] (reduce into dimset lower-dims))
                    (apply math.combo/cartesian-product underlying-dim-groups))))))))
      ```
      
      * Adding specs for dashcard, dashcards, context (minimal) and instrumenting
      make-cards. I'm wondering if we should move card-candidates into the
      layout-producer protocol since we're using known affinities to make a thing,
      not to make a baby step to make a thing. The one gotcha in this is that
      there's a positional index inserted in there which IDK how we use other
      than maybe layout ATM.
      
      * Added more schemas and externalized the matching of affinities to
      potential dimensions (or any map conforming to a map of dimension
      names to matches of items). This generalizes our ability to match
      affinity groups with "things" that we want to generate.
      
      * Renaming output of all-satisfied-bindings to satisfied-bindings to prevent variable shadowing error.
      
      ---------
      
      Co-authored-by: Mark Bastian <markbastian@gmail.com>
    • Cal Herries
    • Tim Macdonald
      Use Bigints for CSV uploads (#34118) · ecf9d549
      Tim Macdonald authored
      [Fixes #34080]
    • Nicolò Pretto
      Revert "show columns added after the dashcard has been created in the viz... · 7e0eb68f
      Nicolò Pretto authored
      Revert "show columns added after the dashcard has been created in the viz settings (#33886)" (#34116)
      
    • Ngoc Khuat
      Fix the flaky test 'downgrade-dashboard-tabs-test' (#34115) · 3a1b10c6
      Ngoc Khuat authored
      * Fix the flaky test 'downgrade-dashboard-tabs-test' by bumping the
      migration version we upgrade to during the test. I'm not fully sure why
      it was flaky in the first place. For some reason the rollback step
      doesn't get triggered on MySQL even though Liquibase does acknowledge
      that the migration was run?? Either way, bumping the version seems to
      make the flakiness go away.
    • Cal Herries
    • Mahatthana (Kelvin) Nomsawadi
  3. Sep 25, 2023
  4. Sep 22, 2023
    • Nemanja Glumac
      Do not offer to delete archived collections (#33991) · 14d4665c
      Nemanja Glumac authored
      * Reproduce #33996
    • Case Nelson
      [MLv2] Add query can-run function (#34040) · 20988e95
      Case Nelson authored
      
      * [MLv2] Add query can-run function
      
      * Update src/metabase/lib/native.cljc
      
      Co-authored-by: metamben <103100869+metamben@users.noreply.github.com>
      
      * Add test for non-native query
      
      * Fix tests - snippet-id can be optional
      
      ---------
      
      Co-authored-by: metamben <103100869+metamben@users.noreply.github.com>
    • dpsutton
      ~~Bump h2~~ inline parameter to fix flaky test (#34062) · fa764262
      dpsutton authored
      * Bump h2 to fix flaky test
      
      Symptom:
      --------
      
      error in linter job:
      ```
      With premium token features = #{"audit-app"}
      :metabase-enterprise.audit-app.pages.dashboards/most-popular-with-avg-speed
      
      expected: (schema= {:status (s/eq :completed), s/Keyword s/Any} (qp/process-query query))
        actual: clojure.lang.ExceptionInfo: Error reducing result rows: Error running audit query: Invalid value "NULL" for parameter "result FETCH"; SQL statement:
      WITH "MOST_POPULAR" AS (SELECT "D"."ID" AS "DASHBOARD_ID", "D"."NAME" AS "DASHBOARD_NAME", COUNT(*) AS "VIEWS" FROM "VIEW_LOG" AS "VL" LEFT JOIN "REPORT_DASHBOARD" AS "D" ON "VL"."MODEL_ID" = "D"."ID" WHERE "VL"."MODEL" = 'dashboard' GROUP BY "D"."ID" ORDER BY COUNT(*) DESC LIMIT ?), "CARD_RUNNING_TIME" AS (SELECT "QE"."CARD_ID", AVG("QE"."RUNNING_TIME") AS "AVG_RUNNING_TIME" FROM "QUERY_EXECUTION" AS "QE" WHERE "QE"."CARD_ID" IS NOT NULL GROUP BY "QE"."CARD_ID"), "DASH_AVG_RUNNING_TIME" AS (SELECT "D"."ID" AS "DASHBOARD_ID", AVG("RT"."AVG_RUNNING_TIME") AS "AVG_RUNNING_TIME" FROM "REPORT_DASHBOARDCARD" AS "DC" LEFT JOIN "CARD_RUNNING_TIME" AS "RT" ON "DC"."CARD_ID" = "RT"."CARD_ID" LEFT JOIN "REPORT_DASHBOARD" AS "D" ON "DC"."DASHBOARD_ID" = "D"."ID" WHERE "D"."ID" IN (SELECT "DASHBOARD_ID" FROM "MOST_POPULAR") GROUP BY "D"."ID") SELECT "MP"."DASHBOARD_ID", "MP"."DASHBOARD_NAME", "MP"."VIEWS", "RT"."AVG_RUNNING_TIME" FROM "MOST_POPULAR" AS "MP" LEFT JOIN "DASH_AVG_RUNNING_TIME" AS "RT" ON "MP"."DASHBOARD_ID" = "RT"."DASHBOARD_ID" ORDER BY "MP"."VIEWS" DESC LIMIT ? [90008-214]
      
      <stack traces of us catching and rethrowing>
      Caused by: org.h2.jdbc.JdbcSQLDataException: Invalid value "NULL" for parameter "result FETCH"; SQL statement:
      WITH "MOST_POPULAR" AS (SELECT "D"."ID" AS "DASHBOARD_ID", "D"."NAME" AS "DASHBOARD_NAME", COUNT(*) AS "VIEWS" FROM "VIEW_LOG" AS "VL" LEFT JOIN "REPORT_DASHBOARD" AS "D" ON "VL"."MODEL_ID" = "D"."ID" WHERE "VL"."MODEL" = 'dashboard' GROUP BY "D"."ID" ORDER BY COUNT(*) DESC LIMIT ?), "CARD_RUNNING_TIME" AS (SELECT "QE"."CARD_ID", AVG("QE"."RUNNING_TIME") AS "AVG_RUNNING_TIME" FROM "QUERY_EXECUTION" AS "QE" WHERE "QE"."CARD_ID" IS NOT NULL GROUP BY "QE"."CARD_ID"), "DASH_AVG_RUNNING_TIME" AS (SELECT "D"."ID" AS "DASHBOARD_ID", AVG("RT"."AVG_RUNNING_TIME") AS "AVG_RUNNING_TIME" FROM "REPORT_DASHBOARDCARD" AS "DC" LEFT JOIN "CARD_RUNNING_TIME" AS "RT" ON "DC"."CARD_ID" = "RT"."CARD_ID" LEFT JOIN "REPORT_DASHBOARD" AS "D" ON "DC"."DASHBOARD_ID" = "D"."ID" WHERE "D"."ID" IN (SELECT "DASHBOARD_ID" FROM "MOST_POPULAR") GROUP BY "D"."ID") SELECT "MP"."DASHBOARD_ID", "MP"."DASHBOARD_NAME", "MP"."VIEWS", "RT"."AVG_RUNNING_TIME" FROM "MOST_POPULAR" AS "MP" LEFT JOIN "DASH_AVG_RUNNING_TIME" AS "RT" ON "MP"."DASHBOARD_ID" = "RT"."DASHBOARD_ID" ORDER BY "MP"."VIEWS" DESC LIMIT ? [90008-214]
       at org.h2.message.DbException.getJdbcSQLException (DbException.java:646)
          org.h2.message.DbException.getJdbcSQLException (DbException.java:477)
          org.h2.message.DbException.get (DbException.java:223)
          org.h2.message.DbException.getInvalidValueException (DbException.java:298)
          org.h2.command.query.Query.getOffsetFetch (Query.java:912)
          org.h2.command.query.Select.queryWithoutCache (Select.java:768)
          org.h2.command.query.Query.queryWithoutCacheLazyCheck (Query.java:197)
          org.h2.command.query.Query.query (Query.java:512)
          ...
      ```
      
      BUT this only causes an issue for tests run under the cloverage
      linter. It's not an issue under any other test scenarios.
      
      Reproduction:
      -------------
      
      I can get the same stack traces with the following:
      
      ```clojure
      dashboards=> (clojure.java.jdbc/query {:datasource
                                             (:data-source metabase.db.connection/*application-db*)}
                                            ["select 1 limit null"])
      Execution error (JdbcSQLDataException) at org.h2.message.DbException/getJdbcSQLException (DbException.java:646).
      Invalid value "NULL" for parameter "result FETCH"; SQL statement:
      select 1 limit null [90008-214]
      
      dashboards=> (clojure.java.jdbc/query {:datasource
                                             (:data-source metabase.db.connection/*application-db*)}
                                            ["select 1 limit ?" nil])
      Execution error (JdbcSQLDataException) at org.h2.message.DbException/getJdbcSQLException (DbException.java:646).
      Invalid value "NULL" for parameter "result FETCH"; SQL statement:
      select 1 limit ? [90008-214]
      ```
      
      But with a normal repl, running the test does not error:
      
      ```clojure
      (deftest all-queries-test
        (mt/with-test-user :crowberto
          (with-temp-objects [objects]
            (premium-features-test/with-premium-features #{:audit-app}
                                   ;; limit to just the version
              (doseq [query-type #_(all-query-methods) [:metabase-enterprise.audit-app.pages.dashboards/most-popular-with-avg-speed]]
                (testing query-type
                  (do-tests-for-query-type query-type objects)))))))
      
      pages-test=> (clojure.test/run-test all-queries-test)
      
      Testing metabase-enterprise.audit-app.pages-test
      
      Ran 1 tests containing 1 assertions.
      0 failures, 0 errors.
      {:test 1, :pass 1, :fail 0, :error 0, :type :summary}
      ```
      
      And even more frustrating, just grabbing the sql and params and calling
      the same code:
      
      ```clojure
      common=> (let [driver (mdb/db-type)
                     sql check/sql
                     params check/params]
                 (with-open [conn (.getConnection mdb.connection/*application-db*)
                             stmt (doto (sql-jdbc.execute/prepared-statement driver conn sql params)
                                    foo
                                    )
                             rs   (sql-jdbc.execute/execute-prepared-statement! driver stmt)]
                   (into [] (clojure.java.jdbc/result-set-seq rs))))
      []
      ```
      
      I can get the test to fail by running the cloverage runner with a socket
      repl started and then running the test during that instrumented run:
      
      ```
      clojure -J"$(socket-repl 6000)" -X:dev:ee:ee-dev:test:cloverage
      ```
      
      But I'm not able to find anything interesting. It's a bug in CTEs with
      parameters in h2, and it seems that only this query is sensitive.
      
      If I inline the value it works under 2.1.214. And if I bump the version
      to 2.2.224 it works. So I'm bumping the version.
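      The workaround eventually adopted (per the bullet below) is to inline the value. A hypothetical illustration of that idea, assuming the limit is a trusted numeric value under our control rather than user input:

      ```clojure
      (require '[clojure.string :as str])

      ;; Replace the bound LIMIT parameter with a numeric literal so h2
      ;; never sees a parameterized FETCH inside the CTE.
      (defn inline-limit
        [sql limit]
        (str/replace-first sql "LIMIT ?" (str "LIMIT " (long limit))))
      ```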
      
      * inline the offending value
      
      * Revert "Bump h2 to fix flaky test"
      
      This reverts commit 9cd6aed19cc4c143f79e611bb0c92218f18da365.
    • Case Nelson
      [MLv2] Handle coalesce type after conversion (#33814) · bf45689c
      Case Nelson authored
      
      * [MLv2] Handle coalesce type after conversion
      
      * Update test/metabase/lib/aggregation_test.cljc
      
      Co-authored-by: metamben <103100869+metamben@users.noreply.github.com>
      
      * Only update fields on converted queries
      
      * Don't convert if base-type exists
      
      * Guard against non-numeric field ids
      
      * Fix test with hardcoded ids
      
      * Allow metadata/field to return nil on not found
      
      * Add ipaddress and mongobsonid types to equality-comparable-types
      
      * Support is-empty not-empty for mongobsonid
      
      * Add base-type to FE tests
      
      * Fix e2e tests - column metadata is valid with field types filled in
      
      ---------
      
      Co-authored-by: metamben <103100869+metamben@users.noreply.github.com>
    • Case Nelson
      [MLv2] Add function for database engine (#33942) · 1815233a
      Case Nelson authored
      
      * [MLv2] Add function for database engine
      
      * Update src/metabase/lib/js.cljs
      
      Co-authored-by: metamben <103100869+metamben@users.noreply.github.com>
      
      * Use native query
      
      ---------
      
      Co-authored-by: metamben <103100869+metamben@users.noreply.github.com>
    • Case Nelson
      [MLv2] Fix template-tags snippet-id (#33902) · d7de9380
      Case Nelson authored
      * [MLv2] Fix template-tags snippet-id
      
      Looking at this bug I found there were a couple of problems that the
      converters were not properly handling. First, the conversion to js was
      copying the snake_case key to a kebab-case one instead of renaming it.
      Second, the conversion to clojure was not picking up the kebab-case keys
      but using the snake_case keys.
      
      So I dropped the TemplateTag encoder and decoder and went forward with
      js->clj and clj->js, since the only tricky part was keeping the tag names
      as strings and keeping the :type as a keyword.
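      The tricky part of that round trip can be sketched in plain Clojure (the helper name is hypothetical, not the actual cljs converter): top-level tag names stay strings, while each tag's :type comes back as a keyword.

      ```clojure
      (defn normalize-template-tags
        "Keep tag names (the map's top-level keys) as strings; coerce each
         tag's :type value back to a keyword."
        [tags]
        (update-vals tags (fn [tag] (update tag :type keyword))))
      ```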
      
      * Address review comments
    • Uladzimir Havenchyk
    • Uladzimir Havenchyk
    • Nemanja Glumac
      Fix archived items tooltip that's missing the item type (#33970) · ff0ab8bb
      Nemanja Glumac authored
      * Make archived item rely on the `model` for its type
      
      The `type` prop doesn't exist in the list of items.
      `model`, on the other hand, exists, and it shows the underlying item model,
      which almost 1:1 matches the desired type.
      
      Almost, because we call questions "card" in the code.
      That will be accounted for in the next commit.
      
      * Handle `card` and `dataset` models
      
      Show "question" and "model" in the UI instead.
      
      * Move `getTranslatedEntityName` to the common utils
      
      * Use translated entities for the archived item tooltip
      
      * Switch back to the implicit return
      
      * Lowercase the translated model name
      
      * Add missing string translation
      
      * Add unit test for `getTranslatedEntityName`