diff --git a/docs/administration-guide/13-embedding.md b/docs/administration-guide/13-embedding.md
index 87ebe61da7b368bd0d10fe62885ed583c1806202..f13cf0cc780d5e1df653b7a62ca515444a7bfa9f 100644
--- a/docs/administration-guide/13-embedding.md
+++ b/docs/administration-guide/13-embedding.md
@@ -63,6 +63,16 @@ If you wish to have a parameter locked down to prevent your embedding applicatio
 
+When using locked Field Filters with multiple selected values, provide them as a JSON array. Example:
+
+```
+...
+params: {
+  foo: ['Value1', 'Value2'],
+},
+...
+```
+
 ### Hiding parameters
 
 If you have parameters that aren't required, but that you'd like to be hidden, instead of marking them as Locked you can use the `hide_parameters` URL option to hide one or more parameters (i.e., prevent it from showing up as a filter widget on screen). You'll want to add this option to the Metabase URL specified in your embedding iframe.
diff --git a/docs/operations-guide/configuring-application-database.md b/docs/operations-guide/configuring-application-database.md
index ff189cb1178023c5041f91dcf9e5f6bae2dab990..50d10c53d76d7f755a9133d1abb1e172249d8cfc 100644
--- a/docs/operations-guide/configuring-application-database.md
+++ b/docs/operations-guide/configuring-application-database.md
@@ -48,6 +48,10 @@ You can change the application database to use Postgres using a few simple envir
     export MB_DB_HOST=localhost
     java -jar metabase.jar
 
+Metabase will not create this database for you. Example command to create the database:
+
+    createdb --encoding=UTF8 -e metabase
+
 This will tell Metabase to look for its application database using the supplied Postgres connection information. Metabase also supports providing a full JDBC connection URI if you have additional parameters:
 
     export MB_DB_CONNECTION_URI="postgres://localhost:5432/metabase?user=<username>&password=<password>"
@@ -65,6 +69,10 @@ If you prefer to use MySQL or MariaDB we've got you covered. The minimum recomme
     export MB_DB_HOST=localhost
     java -jar metabase.jar
 
+Metabase will not create this database for you. Example SQL statement to create the database:
+
+    CREATE DATABASE metabase CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
+
 This will tell Metabase to look for its application database using the supplied MySQL connection information. Metabase also supports providing a full JDBC connection URI if you have additional parameters:
 
     export MB_DB_CONNECTION_URI="mysql://localhost:3306/metabase?user=<username>&password=<password>"
diff --git a/docs/operations-guide/environment-variables.md b/docs/operations-guide/environment-variables.md
index 50ecfb32342a39463a1edbe70275549f9c1fdf00..7ca0208e82ceed5be70510ab5f879bea7a30b9f2 100644
--- a/docs/operations-guide/environment-variables.md
+++ b/docs/operations-guide/environment-variables.md
@@ -40,7 +40,7 @@ Session expiration, defined in minutes (default is 2 weeks), which will log out
 
 Note: This setting is not an idle/inactivity timeout. If you set this to 15 minutes, your users have to login (or re-authenticate) again every 15 minutes.
 
-Use [MB_SESSION_COOKIES](#MB_SESSION_COOKIES) to also expire sessions, when browser is closed.
+Use [MB_SESSION_COOKIES](#mb_session_cookies) to also expire sessions when the browser is closed.
 
 Also see the [Changing session expiration](changing-session-expiration.md) documentation page.
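To make the locked Field Filter array from the `docs/administration-guide/13-embedding.md` hunk above concrete, here is a minimal sketch of signing such an embed URL, assuming a Node backend with the `jsonwebtoken` package; the site URL, secret key, dashboard id, and parameter names are placeholders:

```js
// Minimal sketch, assuming Node and the `jsonwebtoken` package.
// METABASE_SITE_URL, METABASE_SECRET_KEY, dashboard id 42, and the
// parameter names `foo`/`bar` are hypothetical.
const jwt = require("jsonwebtoken");

const METABASE_SITE_URL = "https://metabase.example.com";
const METABASE_SECRET_KEY = "<your embedding secret key>";

const payload = {
  resource: { dashboard: 42 },
  // locked Field Filter with multiple selected values, passed as a JSON array
  params: { foo: ["Value1", "Value2"] },
  exp: Math.round(Date.now() / 1000) + 10 * 60, // token valid for 10 minutes
};

const token = jwt.sign(payload, METABASE_SECRET_KEY);

// hide the (unlocked) `bar` filter widget via the hide_parameters URL option
const iframeUrl = `${METABASE_SITE_URL}/embed/dashboard/${token}#hide_parameters=bar`;
```

The `hide_parameters` fragment corresponds to the URL option described in the same hunk; hiding a widget is cosmetic rather than a security measure, so anything sensitive should use a Locked parameter instead.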
@@ -85,7 +85,7 @@ Change this to a higher value if you notice that regular usage consumes all or c
 
 To see how many connections are being used, check the Metabase logs and look for lines that contains the following: `… App DB connections: 12/15 …`. In this example, 12 out of 15 available connections are being used.
 
-See [MB_JDBC_DATA_WAREHOUSE_MAX_CONNECTION_POOL_SIZE](#MB_JDBC_DATA_WAREHOUSE_MAX_CONNECTION_POOL_SIZE) for setting maximum connections to the databases connected to Metabase.
+See [MB_JDBC_DATA_WAREHOUSE_MAX_CONNECTION_POOL_SIZE](#mb_jdbc_data_warehouse_max_connection_pool_size) for setting maximum connections to the databases connected to Metabase.
 
 #### `MB_APPLICATION_FAVICON_URL`
@@ -117,7 +117,7 @@ Type: integer<br>
 Default: `50`<br>
 Since: 0.35.0
 
-Maximum number of async Jetty threads. If not set, then [MB_JETTY_MAXTHREADS](#MB_JETTY_MAXTHREADS) will be used, otherwise it will use the default.
+Maximum number of async Jetty threads. If not set, [MB_JETTY_MAXTHREADS](#mb_jetty_maxthreads) is used; if that is also unset, the default applies.
 
 #### `MB_BREAKOUT_BIN_WIDTH`
@@ -152,7 +152,7 @@ Used to recognize the Mac App client, which then defaults to `"OSX"`.
 
 Type: boolean<br>
 Default: `true`
 
-Color log lines. When set to `false` it will disable log line colors. This is disabled on Windows. Related to [MB_EMOJI_IN_LOGS](#MB_EMOJI_IN_LOGS).
+Color log lines. When set to `false` it will disable log line colors. This is disabled on Windows. Related to [MB_EMOJI_IN_LOGS](#mb_emoji_in_logs).
 
 #### `MB_CUSTOM_FORMATTING`
@@ -187,7 +187,7 @@ Timeout in milliseconds for connecting to the application database.
 
 Type: string<br>
 Default: `null`
 
-A JDBC-style connection URI that can be used instead of most of `MB_DB_*` like [MB_DB_HOST](#MB_DB_HOST). Also used when certain Connection String parameters are required for the connection. The connection type requirement is the same as [MB_DB_TYPE](#MB_DB_TYPE).
+A JDBC-style connection URI that can be used instead of most of `MB_DB_*` like [MB_DB_HOST](#mb_db_host). Also used when certain Connection String parameters are required for the connection. The connection type requirement is the same as [MB_DB_TYPE](#mb_db_type).
 
 Example: `postgres://dbuser:dbpassword@db.example.com:port/mydb?ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory`
@@ -196,14 +196,14 @@ Example: `postgres://dbuser:dbpassword@db.example.com:port/mydb?ssl=true&sslfact
 
 Type: string<br>
 Default: `null`
 
-The database name of the application database used with [MB_DB_HOST](#MB_DB_HOST).
+The database name of the application database used with [MB_DB_HOST](#mb_db_host).
 
 #### `MB_DB_FILE`
 
 Type: string<br>
 Default: `"metabase.db"`
 
-Location of H2 database file. Should not include the `.mv.db` (or `.h2.db`) file extension. Used when [MB_DB_TYPE](#MB_DB_TYPE) is set to`"h2"`.
+Location of H2 database file. Should not include the `.mv.db` (or `.h2.db`) file extension. Used when [MB_DB_TYPE](#mb_db_type) is set to `"h2"`.
 
 Can also be used when migrating away from H2 to specify where the existing data should be read from.
 
@@ -212,42 +212,42 @@ Can also be used when migrating away from H2 to specify where the existing data
 
 Type: string<br>
 Default: `null`
 
-The host name or IP address of the application database. Used when [MB_DB_TYPE](#MB_DB_TYPE) is different than `"h2"`.
+The host name or IP address of the application database. Used when [MB_DB_TYPE](#mb_db_type) is different from `"h2"`.
#### `MB_DB_IN_MEMORY` Type: boolean<br> Default: `null` -Used for testing with [MB_DB_FILE](#MB_DB_FILE). +Used for testing with [MB_DB_FILE](#mb_db_file). #### `MB_DB_PASS` Type: string<br> Default: `null` -The password for [MB_DB_HOST](#MB_DB_HOST). +The password for [MB_DB_HOST](#mb_db_host). #### `MB_DB_PORT` Type: integer<br> Default: `null` -The port for [MB_DB_HOST](#MB_DB_HOST). +The port for [MB_DB_HOST](#mb_db_host). #### `MB_DB_TYPE` Type: string (`"h2"`, `"postgres"`, `"mysql"`)<br> Default: `"h2"` -When `"h2"`, the application database is loaded from [MB_DB_FILE](#MB_DB_FILE), otherwise [MB_DB_HOST](#MB_DB_HOST) will be used to define application database. +When `"h2"`, the application database is loaded from [MB_DB_FILE](#mb_db_file), otherwise [MB_DB_HOST](#mb_db_host) will be used to define application database. #### `MB_DB_USER` Type: string<br> Default: `null` -The username for [MB_DB_HOST](#MB_DB_HOST). +The username for [MB_DB_HOST](#mb_db_host). #### `MB_DISABLE_SESSION_THROTTLE` @@ -256,7 +256,7 @@ Default: `false` When `true`, this will disable session throttling. **Warning:** It is not recommended to disable throttling, since it is a protective measure against brute-force attacks. -Use [MB_SOURCE_ADDRESS_HEADER](#MB_SOURCE_ADDRESS_HEADER) to set the IP address of the remote client from e.g. a reverse-proxy. +Use [MB_SOURCE_ADDRESS_HEADER](#mb_source_address_header) to set the IP address of the remote client from e.g. a reverse-proxy. #### `MB_EMAIL_FROM_ADDRESS` @@ -313,7 +313,7 @@ URL of origin allowed to embed the full Metabase application. Type: boolean<br> Default: `true` -Emojis on log lines. When set to `false` it will disable log line emojis. This is disabled on Windows. Related to [MB_COLORIZE_LOGS](#MB_COLORIZE_LOGS). +Emojis on log lines. When set to `false` it will disable log line emojis. This is disabled on Windows. Related to [MB_COLORIZE_LOGS](#mb_colorize_logs). #### `MB_ENABLE_EMBEDDING` @@ -398,7 +398,7 @@ Maximum number of connections to the data source databases. The maximum is for e Change this to a higher value if you notice that regular usage consumes all or close to all connections. When all connections are in use then Metabase will be slower to return results for queries, since it would have to wait for an available connection before processing the next query in the queue. -See [MB_APPLICATION_DB_MAX_CONNECTION_POOL_SIZE](#MB_APPLICATION_DB_MAX_CONNECTION_POOL_SIZE) for setting maximum connections to the Metabase application database. +See [MB_APPLICATION_DB_MAX_CONNECTION_POOL_SIZE](#mb_application_db_max_connection_pool_size) for setting maximum connections to the Metabase application database. #### `MB_JETTY_ASYNC_RESPONSE_TIMEOUT` @@ -420,7 +420,7 @@ Use daemon threads. Type: string<br> Default: `localhost` for JAR, `0.0.0.0` for Docker -Configure a host either as a host name or IP address to identify a specific network interface on which to listen. If set to `"0.0.0.0"`, Metabase listens on all network interfaces. It will listen on the port specified in [MB_JETTY_PORT](#MB_JETTY_PORT). +Configure a host either as a host name or IP address to identify a specific network interface on which to listen. If set to `"0.0.0.0"`, Metabase listens on all network interfaces. It will listen on the port specified in [MB_JETTY_PORT](#mb_jetty_port). #### `MB_JETTY_JOIN` @@ -429,14 +429,6 @@ Default: `true` Blocks the thread until server ends. 
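On the `MB_SOURCE_ADDRESS_HEADER` note above: when Metabase sits behind a reverse proxy, every request appears to come from the proxy's own address, which would make session throttling lump all clients together. A minimal Node sketch of the idea, assuming the `http-proxy` package and Metabase listening on port 3000 (both placeholders); it forwards the real client address in the default `X-Forwarded-For` header that Metabase reads:

```js
// Minimal sketch, assuming the `http-proxy` npm package; the ports are placeholders.
const http = require("http");
const httpProxy = require("http-proxy");

const proxy = httpProxy.createProxyServer({ target: "http://localhost:3000" });

// forward the real client address so Metabase can throttle per client
proxy.on("proxyReq", (proxyReq, req) => {
  proxyReq.setHeader("X-Forwarded-For", req.socket.remoteAddress);
});

http.createServer((req, res) => proxy.web(req, res)).listen(8080);
```

Production proxies such as nginx normally set this header for you; the sketch only illustrates what `MB_SOURCE_ADDRESS_HEADER` consumes.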
-#### `MB_JETTY_MAX_REQUEST_HEADER_SIZE` - -Type: integer<br> -Default: `8192`<br> -Since: 0.36.0 - -Maximum size of a request header, in bytes. Increase this value if you are experiencing errors like "Request Header Fields Too Large". - #### `MB_JETTY_MAXIDLETIME` Type: integer<br> @@ -462,7 +454,7 @@ Change this to a higher value if you notice that regular usage consumes all or c To see how many threads are being used, check the Metabase logs and look for lines that contain the following: `… Jetty threads: 45/50 …`, which in this case would indicate 45 out of 50 available threads are being used. -Related [MB_ASYNC_QUERY_THREAD_POOL_SIZE](#MB_ASYNC_QUERY_THREAD_POOL_SIZE). +Related [MB_ASYNC_QUERY_THREAD_POOL_SIZE](#mb_async_query_thread_pool_size). #### `MB_JETTY_MINTHREADS` @@ -476,7 +468,15 @@ Minimum number of threads. Type: integer<br> Default: `3000` -Configure which port to use for HTTP. It will listen on the interface specified in [MB_JETTY_HOST](#MB_JETTY_HOST). +Configure which port to use for HTTP. It will listen on the interface specified in [MB_JETTY_HOST](#mb_jetty_host). + +#### `MB_JETTY_REQUEST_HEADER_SIZE` + +Type: integer<br> +Default: `8192`<br> +Since: 0.36.0 + +Maximum size of a request header, in bytes. Increase this value if you are experiencing errors like "Request Header Fields Too Large". #### `MB_JETTY_SSL` @@ -506,7 +506,7 @@ Password for Java KeyStore file. Type: integer<br> Default: `null` -Configure which port to use for HTTPS. It will listen on the interface specified in [MB_JETTY_HOST](#MB_JETTY_HOST). +Configure which port to use for HTTPS. It will listen on the interface specified in [MB_JETTY_HOST](#mb_jetty_host). #### `MB_JETTY_SSL_TRUSTSTORE` @@ -742,7 +742,7 @@ Comma-separated namespaces to trace. **WARNING:** Could log sensitive informatio Type: string (`"weak"`, `"normal"`, `"strong"`)<br> Default: `"normal"` -Enforce a password complexity rule to increase security for regular logins. This only applies to new users or users that are changing their password. Related [MB_PASSWORD_LENGTH](#MB_PASSWORD_LENGTH) +Enforce a password complexity rule to increase security for regular logins. This only applies to new users or users that are changing their password. Related [MB_PASSWORD_LENGTH](#mb_password_length) - `weak` requires length of 6 characters - `normal` requires at least 1 digit and length of 6 characters @@ -753,7 +753,7 @@ Enforce a password complexity rule to increase security for regular logins. This Type: integer<br> Default: `6` -Set a minimum password length to increase security for regular logins. This only applies to new users or users that are changing their password. Uses the length of [MB_PASSWORD_COMPLEXITY](#MB_PASSWORD_COMPLEXITY) if not set. +Set a minimum password length to increase security for regular logins. This only applies to new users or users that are changing their password. Uses the length of [MB_PASSWORD_COMPLEXITY](#mb_password_complexity) if not set. #### `MB_PLUGINS_DIR` @@ -769,7 +769,7 @@ The location is where custom third-party drivers should be added. Then Metabase Type: string<br> Default: `null` -Token for premium features. +Token for Enterprise Edition and Premium features. #### `MB_QP_CACHE_BACKEND` @@ -812,7 +812,7 @@ Type: boolean<br> Default: `false`<br> Since: 0.36.0 -Force all traffic to use HTTPS via a redirect, if the site URL is HTTPS. Related [MB_SITE_URL](#MB_SITE_URL) +Force all traffic to use HTTPS via a redirect, if the site URL is HTTPS. 
Related [MB_SITE_URL](#mb_site_url) #### `MB_REPORT_TIMEZONE` @@ -938,7 +938,7 @@ Send email notifications to users in Admin group, when a new SSO users is create Type: boolean<br> Default: `null` -When set to `true`, the user login session will expire, when the browser is closed. The user login session will always expire after the amount of time defined in [MAX_SESSION_AGE](#MAX_SESSION_AGE) (by default 2 weeks). +When set to `true`, the user login session will expire, when the browser is closed. The user login session will always expire after the amount of time defined in [MAX_SESSION_AGE](#max_session_age) (by default 2 weeks). Also see the [Changing session expiration](changing-session-expiration.md) documentation page. @@ -1003,7 +1003,7 @@ Slack API bearer token obtained from https://api.slack.com/web#authentication Type: string<br> Default: `X-Forwarded-For` -Identify the source of HTTP requests by this header's value, instead of its remote address. Related to [MB_DISABLE_SESSION_THROTTLE](#MB_DISABLE_SESSION_THROTTLE). +Identify the source of HTTP requests by this header's value, instead of its remote address. Related to [MB_DISABLE_SESSION_THROTTLE](#mb_disable_session_throttle). #### `MB_SSL_CERTIFICATE_PUBLIC_KEY` diff --git a/frontend/src/metabase/auth/components/GoogleButton.jsx b/frontend/src/metabase/auth/components/GoogleButton.jsx index 091a76d73a4f3e87b0dd2d6669a73ebf52d2be2c..4acff8e30dbda89857d434f29fc7d9b1f4518f5d 100644 --- a/frontend/src/metabase/auth/components/GoogleButton.jsx +++ b/frontend/src/metabase/auth/components/GoogleButton.jsx @@ -40,15 +40,38 @@ export default class GoogleButton extends Component { auth2.attachClickHandler( element, {}, - googleUser => { + async googleUser => { this.setState({ errorMessage: null }); - loginGoogle(googleUser, location.query.redirect); + const result = await loginGoogle( + googleUser, + location.query.redirect, + ); + + if ( + result.payload["status"] && + result.payload["status"] === 400 && + result.payload.data && + result.payload.data.errors + ) { + let errorMessage = ""; + for (const [, value] of Object.entries( + result.payload.data.errors, + )) { + if (errorMessage !== "") { + errorMessage = errorMessage + "<br/>"; + } + errorMessage = errorMessage + value; + } + this.setState({ + errorMessage: errorMessage, + }); + } }, error => { this.setState({ errorMessage: GOOGLE_AUTH_ERRORS[error.error] || - t`There was an issue signing in with Google. Pleast contact an administrator.`, + t`There was an issue signing in with Google. 
Please contact an administrator.`, }); }, ); diff --git a/frontend/src/metabase/lib/formatting.js b/frontend/src/metabase/lib/formatting.js index faa52d569a5d1f2f34f3045f843d303316669ff5..3173d847eb88b4bb1f5d6a25f5bb4ff5aa1eed26 100644 --- a/frontend/src/metabase/lib/formatting.js +++ b/frontend/src/metabase/lib/formatting.js @@ -842,7 +842,7 @@ export function stripId(name: string) { } export function slugify(name: string) { - return name && name.toLowerCase().replace(/[^a-z\u0400-\u04ff0-9_]/g, "_"); + return name && encodeURIComponent(name.toLowerCase().replace(/\s/g, "_")); } export function assignUserColors( diff --git a/frontend/test/metabase/lib/formatting.unit.spec.js b/frontend/test/metabase/lib/formatting.unit.spec.js index b0e38eee38a0d42019c8060f711850029a6bfb0e..4c533e7f8869087e4d8daf85b628e1507ff2d19b 100644 --- a/frontend/test/metabase/lib/formatting.unit.spec.js +++ b/frontend/test/metabase/lib/formatting.unit.spec.js @@ -6,6 +6,7 @@ import { formatValue, formatUrl, formatDateTimeWithUnit, + slugify, } from "metabase/lib/formatting"; import ExternalLink from "metabase/components/ExternalLink"; import { TYPE } from "metabase/lib/types"; @@ -344,4 +345,24 @@ describe("formatting", () => { } }); }); + + describe("slugify", () => { + it("should slugify Chinese", () => { + expect(slugify("é¡žåž‹")).toEqual("%E9%A1%9E%E5%9E%8B"); + }); + + it("should slugify multiple words", () => { + expect(slugify("Test Parameter")).toEqual("test_parameter"); + }); + + it("should slugify Russian", () => { + expect(slugify("руÑÑкий Ñзык")).toEqual( + "%D1%80%D1%83%D1%81%D1%81%D0%BA%D0%B8%D0%B9_%D1%8F%D0%B7%D1%8B%D0%BA", + ); + }); + + it("should slugify diacritics", () => { + expect(slugify("än umlaut")).toEqual("%C3%A4n_umlaut"); + }); + }); }); diff --git a/frontend/test/metabase/scenarios/admin/settings/spinner.cy.spec.js b/frontend/test/metabase/scenarios/admin/settings/spinner.cy.spec.js new file mode 100644 index 0000000000000000000000000000000000000000..efca0c86756282e7421b309a816f528612c6ad46 --- /dev/null +++ b/frontend/test/metabase/scenarios/admin/settings/spinner.cy.spec.js @@ -0,0 +1,20 @@ +import { restore, signInAsAdmin } from "__support__/cypress"; + +describe("scenarios > admin > spinner", () => { + before(restore); + beforeEach(signInAsAdmin); + + describe("API request", () => { + it("should return correct DB", () => { + cy.visit("/admin/databases/1"); + cy.findByText("Sample Dataset"); + cy.findByText("Add Database").should("not.exist"); + }); + + it.skip("should not spin forever if it returns an error (Issue #11037)", () => { + cy.visit("/admin/databases/999"); + cy.findAllByText("Databases").should("have.length", 2); + cy.findByText("Loading...").should("not.exist"); + }); + }); +}); diff --git a/frontend/test/metabase/scenarios/auth/search.cy.spec.js b/frontend/test/metabase/scenarios/auth/search.cy.spec.js index c19df331d27eca0c87172694282eb2484b3f1ae2..a63c70b46f619e7736851abe0480b3a0e4548927 100644 --- a/frontend/test/metabase/scenarios/auth/search.cy.spec.js +++ b/frontend/test/metabase/scenarios/auth/search.cy.spec.js @@ -16,7 +16,7 @@ describe("scenarios > auth > search", () => { cy.findByText("PRODUCTS"); }); - it.skip("should work for user with permissions (Issue #12332)", () => { + it("should work for user with permissions (Issue #12332)", () => { signInAsNormalUser(); cy.visit("/"); cy.findByPlaceholderText("Search…").type("product{enter}"); diff --git a/frontend/test/metabase/scenarios/dashboard/parameters.cy.spec.js 
b/frontend/test/metabase/scenarios/dashboard/parameters.cy.spec.js index eca3884dc06b42bf815278fcdec15e32f3f3c105..faf69f0138957486f808270d2a2ec823abc1f042 100644 --- a/frontend/test/metabase/scenarios/dashboard/parameters.cy.spec.js +++ b/frontend/test/metabase/scenarios/dashboard/parameters.cy.spec.js @@ -1,10 +1,17 @@ -import { signInAsAdmin, modal, popover, restore } from "__support__/cypress"; +import { + signInAsAdmin, + modal, + popover, + restore, + openOrdersTable, +} from "__support__/cypress"; // NOTE: some overlap with parameters-embedded.cy.spec.js describe("scenarios > dashboard > parameters", () => { before(restore); beforeEach(signInAsAdmin); + it("should search across multiple fields", () => { // create a new dashboard cy.visit("/"); @@ -63,6 +70,59 @@ describe("scenarios > dashboard > parameters", () => { .last() .contains("4,939"); }); + + it("should filter on a UNIX timestamp", () => { + // Set datatype of Quantity to UNIX Timestamp + cy.visit("/admin/datamodel/database/1/table/2"); + cy.findByText("Quantity").click(); + cy.scrollTo("bottom"); + cy.findByText("UNIX Timestamp (Seconds)").click(); + + // Create a question + openOrdersTable(); + cy.findByText("Summarize").click(); + cy.findByText("Summarize by"); + cy.findByText("Done").click(); + + // Add to a dashboard + cy.findByText("Save").click(); + cy.findAllByText("Save") + .last() + .click(); + cy.findByText("Yes please!").click(); + cy.findByText("Create a new dashboard").click(); + cy.findByPlaceholderText("What is the name of your dashboard?").type( + "Test Dashboard", + ); + cy.findByText("Create").click(); + + // Add filter + cy.get(".Icon-funnel_add").click(); + cy.findByText("Time").click(); + cy.findByText("Relative Date").click(); + + cy.findByText("Select…").click(); + cy.get(".List-item .cursor-pointer") + .first() + .click({ force: true }); + cy.findByText("Select…").should("not.exist"); + + cy.findByText("Select a default value…").click(); + cy.findByText("Yesterday").click(); + + // Save dashboard + cy.findByText("Done").click(); + cy.findByText("Save").click({ force: true }); + cy.findByText("Save").should("not.exist"); + cy.findByText("0"); + + // Open dashboard from collections + cy.visit("/collection/root"); + cy.findByText("Test Dashboard").click(); + cy.findByText("Relative Date"); + cy.findByText("18,760").should("not.exist"); + cy.findByText("0"); + }); }); function selectFilter(selection, filterName) { diff --git a/frontend/test/metabase/scenarios/dashboard/x-rays.cy.spec.js b/frontend/test/metabase/scenarios/dashboard/x-rays.cy.spec.js new file mode 100644 index 0000000000000000000000000000000000000000..86fa173306136a6adc3641923eacee39e6dc8f26 --- /dev/null +++ b/frontend/test/metabase/scenarios/dashboard/x-rays.cy.spec.js @@ -0,0 +1,25 @@ +import { restore, signInAsAdmin } from "../../../__support__/cypress"; + +describe("scenarios > x-rays", () => { + before(restore); + beforeEach(signInAsAdmin); + + it("should exist on homepage when person first signs in", () => { + cy.visit("/"); + cy.contains("A look at your People table"); + cy.contains("A look at your Orders table"); + cy.contains("A look at your Products table"); + cy.contains("A look at your Reviews table"); + }); + + it("should be populated", () => { + cy.visit("/"); + cy.findByText("People table").click(); + + cy.findByText("Something's gone wrong").should("not.exist"); + cy.findByText("Here's an overview of the people in your People table"); + cy.findByText("Overview"); + cy.findByText("Per state"); + 
cy.get(".Card").should("have.length", 11); + }); +}); diff --git a/modules/drivers/bigquery/src/metabase/driver/bigquery.clj b/modules/drivers/bigquery/src/metabase/driver/bigquery.clj index 513a903587e13b33f53cecf8891e3c9a3a507d3f..edced5c000b65de76886e25b22d1a153bf0cf5e1 100644 --- a/modules/drivers/bigquery/src/metabase/driver/bigquery.clj +++ b/modules/drivers/bigquery/src/metabase/driver/bigquery.clj @@ -189,8 +189,9 @@ (doall (for [^TableFieldSchema field (.getFields schema) :let [column-type (.getType field) + column-mode (.getMode field) method (get-method bigquery.qp/parse-result-of-type column-type)]] - (partial method column-type bigquery.common/*bigquery-timezone-id*))) + (partial method column-type column-mode bigquery.common/*bigquery-timezone-id*))) columns (for [column (table-schema->metabase-field-info schema)] diff --git a/modules/drivers/bigquery/src/metabase/driver/bigquery/query_processor.clj b/modules/drivers/bigquery/src/metabase/driver/bigquery/query_processor.clj index c96bb51e46df2db87a89224be74911726ecd506a..e5546f09fb858741e471e7d80dc2da312d81ec75 100644 --- a/modules/drivers/bigquery/src/metabase/driver/bigquery/query_processor.clj +++ b/modules/drivers/bigquery/src/metabase/driver/bigquery/query_processor.clj @@ -55,28 +55,38 @@ (defmulti parse-result-of-type "Parse the values that come back in results of a BigQuery query based on their column type." - {:arglists '([column-type timezone-id v])} - (fn [column-type _ _] column-type)) + {:arglists '([column-type column-mode timezone-id v])} + (fn [column-type _ _ _] column-type)) + +(defn- parse-value + [column-mode v parse-fn] + ;; For results from a query like `SELECT [1,2]`, BigQuery sets the column-mode to `REPEATED` and wraps the column in an ArrayList, + ;; with ArrayMap entries, like: `ArrayList(ArrayMap("v", 1), ArrayMap("v", 2))` + (if (= "REPEATED" column-mode) + (for [result v + ^java.util.Map$Entry entry result] + (parse-fn (.getValue entry))) + (parse-fn v))) (defmethod parse-result-of-type :default - [_ _ v] - v) + [_ column-mode _ v] + (parse-value column-mode v identity)) (defmethod parse-result-of-type "BOOLEAN" - [_ _ v] - (Boolean/parseBoolean v)) + [_ column-mode _ v] + (parse-value column-mode v #(Boolean/parseBoolean %))) (defmethod parse-result-of-type "FLOAT" - [_ _ v] - (Double/parseDouble v)) + [_ column-mode _ v] + (parse-value column-mode v #(Double/parseDouble %))) (defmethod parse-result-of-type "INTEGER" - [_ _ v] - (Long/parseLong v)) + [_ column-mode _ v] + (parse-value column-mode v #(Long/parseLong %))) (defmethod parse-result-of-type "NUMERIC" - [_ _ v] - (bigdec v)) + [_ column-mode _ v] + (parse-value column-mode v bigdec)) (defn- parse-timestamp-str [timezone-id s] ;; Timestamp strings either come back as ISO-8601 strings or Unix timestamps in µs, e.g. 
"1.3963104E9" @@ -86,20 +96,20 @@ (u.date/parse s timezone-id))) (defmethod parse-result-of-type "DATE" - [_ timezone-id s] - (parse-timestamp-str timezone-id s)) + [_ column-mode timezone-id v] + (parse-value column-mode v (partial parse-timestamp-str timezone-id))) (defmethod parse-result-of-type "DATETIME" - [_ timezone-id s] - (parse-timestamp-str timezone-id s)) + [_ column-mode timezone-id v] + (parse-value column-mode v (partial parse-timestamp-str timezone-id))) (defmethod parse-result-of-type "TIMESTAMP" - [_ timezone-id s] - (parse-timestamp-str timezone-id s)) + [_ column-mode timezone-id v] + (parse-value column-mode v (partial parse-timestamp-str timezone-id))) (defmethod parse-result-of-type "TIME" - [_ timezone-id s] - (u.date/parse s timezone-id)) + [_ column-mode timezone-id v] + (parse-value column-mode v (fn [v] (u.date/parse v timezone-id)))) ;;; +----------------------------------------------------------------------------------------------------------------+ diff --git a/modules/drivers/bigquery/test/metabase/driver/bigquery/query_processor_test.clj b/modules/drivers/bigquery/test/metabase/driver/bigquery/query_processor_test.clj index 53b07e2b8793c0d232ef0bb99ccb02fe9d0db5e1..586dedbdfb31d1f069a5e9c759e86da8ddda5020 100644 --- a/modules/drivers/bigquery/test/metabase/driver/bigquery/query_processor_test.clj +++ b/modules/drivers/bigquery/test/metabase/driver/bigquery/query_processor_test.clj @@ -59,6 +59,30 @@ "FROM `v2_test_data.checkins` " "LIMIT 2")} :type :native + :database (mt/id)}))))) + + (testing "queries with array result columns deserialize properly (issue #10275)" + (is (= [[["foo" "bar"] + [1 2] + [3.14159265359 0.5772156649] + [1234M 5678M] + [#t "2018-01-01T00:00Z[UTC]" #t "2018-12-31T00:00Z[UTC]"] + [#t "12:34" #t "20:01:13.230"] + [#t "1957-05-17T03:35Z[UTC]" #t "2018-06-01T01:15:34.120Z[UTC]"] + [#t "2014-09-27T20:30:00.450Z[UTC]" #t "2020-09-27T14:57:00.450Z[UTC]"] + []]] + (mt/rows + (qp/process-query + {:native {:query (str "SELECT ['foo', 'bar'], " + "[1, 2], " + "[3.14159265359, 0.5772156649], " + "[NUMERIC '1234', NUMERIC '5678'], " + "[DATE '2018-01-01', DATE '2018-12-31'], " + "[TIME '12:34:00.00', TIME '20:01:13.23'], " + "[DATETIME '1957-05-17 03:35:00.00', DATETIME '2018-06-01 01:15:34.12'], " + "[TIMESTAMP '2014-09-27 12:30:00.45-08', TIMESTAMP '2020-09-27 09:57:00.45-05'], " + "[]")} + :type :native :database (mt/id)}))))))) (deftest aggregations-test diff --git a/modules/drivers/mongo/src/metabase/driver/mongo/query_processor.clj b/modules/drivers/mongo/src/metabase/driver/mongo/query_processor.clj index ab04ad653a2bd0993ad06b73c86fb0c961675497..3825361c84614b1b02d6c8f81939c14d1c71d849 100644 --- a/modules/drivers/mongo/src/metabase/driver/mongo/query_processor.clj +++ b/modules/drivers/mongo/src/metabase/driver/mongo/query_processor.clj @@ -226,7 +226,10 @@ (defmethod ->rvalue :value [[_ value {base-type :base_type}]] - (if (isa? base-type :type/MongoBSONID) + (if (and (isa? base-type :type/MongoBSONID) + (some? value)) + ;; Passing a nil to the ObjectId constructor throws an exception + ;; "invalid hexadecimal representation of an ObjectId: []" so, just treat it as nil (ObjectId. 
(str value)) value)) diff --git a/modules/drivers/mongo/test/metabase/driver/mongo_test.clj b/modules/drivers/mongo/test/metabase/driver/mongo_test.clj index c232fcd091bedaf90e7c03a1d51f52c21195985f..6461bfe98a7468fd67f3b168eb90d6a1932b7d06 100644 --- a/modules/drivers/mongo/test/metabase/driver/mongo_test.clj +++ b/modules/drivers/mongo/test/metabase/driver/mongo_test.clj @@ -219,15 +219,23 @@ [{:field-name "name", :base-type :type/Text} {:field-name "bird_id", :base-type :type/MongoBSONID}] [["Rasta Toucan" (ObjectId. "012345678901234567890123")] - ["Lucky Pigeon" (ObjectId. "abcdefabcdefabcdefabcdef")]]]]) + ["Lucky Pigeon" (ObjectId. "abcdefabcdefabcdefabcdef")] + ["Unlucky Raven" nil]]]]) (deftest bson-ids-test (mt/test-driver :mongo - (is (= [[2 "Lucky Pigeon" (ObjectId. "abcdefabcdefabcdefabcdef")]] - (rows (mt/dataset with-bson-ids - (mt/run-mbql-query birds - {:filter [:= $bird_id "abcdefabcdefabcdefabcdef"]})))) - "Check that we support Mongo BSON ID and can filter by it (#1367)"))) + (testing "BSON IDs" + (is (= [[2 "Lucky Pigeon" (ObjectId. "abcdefabcdefabcdefabcdef")]] + (rows (mt/dataset with-bson-ids + (mt/run-mbql-query birds + {:filter [:= $bird_id "abcdefabcdefabcdefabcdef"]})))) + "Check that we support Mongo BSON ID and can filter by it (#1367)") + + (is (= [[3 "Unlucky Raven" nil]] + (rows (mt/dataset with-bson-ids + (mt/run-mbql-query birds + {:filter [:is-null $bird_id]})))) + "handle null ObjectId queries properly (#11134)")))) (deftest bson-fn-call-forms-test (mt/test-driver :mongo diff --git a/project.clj b/project.clj index 118e1cea1052bca70518bfd82dbe429a960e7659..2b38eb5d914c2bb2e3d4b9b6101b21c0bc382470 100644 --- a/project.clj +++ b/project.clj @@ -64,7 +64,9 @@ :exclusions [commons-codec commons-io slingshot]] - [clojure.java-time "0.3.2"] ; Java 8 java.time wrapper + ;; fork to address #13102 - see upstream PR: https://github.com/dm3/clojure.java-time/pull/60 + ;; TODO: switch back to the upstream once a version is released with the above patch + [robdaemon/clojure.java-time "0.3.3-SNAPSHOT"] ; Java 8 java.time wrapper [clojurewerkz/quartzite "2.1.0" ; scheduling library :exclusions [c3p0]] [colorize "0.1.1" :exclusions [org.clojure/clojure]] ; string output with ANSI color codes (for logging) @@ -117,7 +119,7 @@ [org.flatland/ordered "1.5.9"] ; ordered maps & sets [org.liquibase/liquibase-core "3.6.3" ; migration management (Java lib) :exclusions [ch.qos.logback/logback-classic]] - [org.mariadb.jdbc/mariadb-java-client "2.5.1"] ; MySQL/MariaDB driver + [org.mariadb.jdbc/mariadb-java-client "2.6.2"] ; MySQL/MariaDB driver [org.postgresql/postgresql "42.2.8"] ; Postgres driver [org.slf4j/slf4j-api "1.7.30"] ; abstraction for logging frameworks -- allows end user to plug in desired logging framework at deployment time [org.slf4j/slf4j-log4j12 "1.7.30"] ; ^^ diff --git a/src/metabase/api/card.clj b/src/metabase/api/card.clj index ae4fd4d1dcd920c60bceecba5aa200d60b5fb73f..58e5ece6a5c29b500ac7de5f714dd42aa1a36826 100644 --- a/src/metabase/api/card.clj +++ b/src/metabase/api/card.clj @@ -611,8 +611,9 @@ (qp/process-query-and-save-execution! query info context)))}}] {:pre [(u/maybe? sequential? parameters)]} (let [card (api/read-check (Card card-id)) - query (assoc (query-for-card card parameters constraints middleware) - :async? true) + query (-> (assoc (query-for-card card parameters constraints middleware) + :async? true) + (assoc-in [:middleware :js-int-to-string?] 
true)) info {:executed-by api/*current-user-id* :context context :card-id card-id diff --git a/src/metabase/api/dataset.clj b/src/metabase/api/dataset.clj index fcce07f5322dcd8b7154fee107c8fb624d895631..67bd74d11796ef8115cad075982b88ed088eccb2 100644 --- a/src/metabase/api/dataset.clj +++ b/src/metabase/api/dataset.clj @@ -49,7 +49,8 @@ info {:executed-by api/*current-user-id* :context :ad-hoc :card-id source-card-id - :nested? (boolean source-card-id)}] + :nested? (boolean source-card-id)} + query (update query :middleware assoc :js-int-to-string? true)] (qp.streaming/streaming-response [context :api] (qp/process-query-and-save-with-max-results-constraints! query info context)))) diff --git a/src/metabase/api/query_description.clj b/src/metabase/api/query_description.clj index 3d3bc63fad9603380c08bdb15d6af3dc2fa33cc8..a095a4208addbf1b8852cb4a81a1b2a8744325f3 100644 --- a/src/metabase/api/query_description.clj +++ b/src/metabase/api/query_description.clj @@ -1,7 +1,9 @@ (ns metabase.api.query-description "Functions for generating human friendly query descriptions" (:require [clojure.string :as str] - [metabase.mbql.util :as mbql.u] + [metabase.mbql + [predicates :as mbql.preds] + [util :as mbql.u]] [metabase.models [field :refer [Field]] [metric :refer [Metric]] @@ -12,11 +14,11 @@ [metadata query] {:table (:display_name metadata)}) -(defn- get-aggregation-description +(defn- get-aggregation-details [metadata query] - (let [field-name (fn [match] (map - (fn [m] (:display_name (Field (mbql.u/field-clause->id-or-literal m)))) - (mbql.u/match match :field-id)))] + (let [field-name (fn [match] (or (when (mbql.preds/Field? match) + (:display_name (Field (mbql.u/field-clause->id-or-literal match)))) + (flatten (get-aggregation-details metadata match))))] (when-let [agg-matches (mbql.u/match query [:aggregation-options _ (options :guard :display-name)] {:type :aggregation :arg (:display-name options)} @@ -24,6 +26,9 @@ [:aggregation-options ag _] (recur ag) + [(operator :guard #{:+ :- :/ :*}) & args] + (interpose (name operator) (map field-name args)) + [:metric arg] {:type :metric :arg (let [metric (Metric arg)] (if (not (str/blank? (:name metric))) @@ -39,7 +44,12 @@ [:cum-sum arg] {:type :cum-sum :arg (field-name arg)} [:max arg] {:type :max :arg (field-name arg)} [:min arg] {:type :min :arg (field-name arg)})] - {:aggregation agg-matches}))) + agg-matches))) + +(defn- get-aggregation-description + [metadata query] + (when-let [details (get-aggregation-details metadata query)] + {:aggregation details})) (defn- get-breakout-description [metadata query] diff --git a/src/metabase/api/search.clj b/src/metabase/api/search.clj index 24aabbb2894ec2dcb53f31ad91fb3361fd6b91ca..83db2c8f06156111a4b77794e63aa7439ffb7a39 100644 --- a/src/metabase/api/search.clj +++ b/src/metabase/api/search.clj @@ -326,19 +326,22 @@ (let [base-query (base-query-for-model Table search-ctx)] (if (contains? 
current-user-perms "/") base-query - {:select (:select base-query) - :from [[(merge - base-query - {:select [:id :schema :db_id :name :description :display_name - [(hx/concat (hx/literal "/db/") :db_id (hx/literal "/") - (hsql/call :case [:not= :schema nil] :schema :else (hx/literal "")) (hx/literal "/") - :id (hx/literal "/")) - :path]]}) - :table]] - :where (cons - :or - (for [path current-user-perms] - [:like :path (str path "%")]))})))) + (let [data-perms (filter #(re-find #"^/db/*" %) current-user-perms)] + (when (seq data-perms) + {:select (:select base-query) + :from [[(merge + base-query + {:select [:id :schema :db_id :name :description :display_name + [(hx/concat (hx/literal "/db/") :db_id + (hx/literal "/schema/") (hsql/call :case + [:not= :schema nil] :schema + :else (hx/literal "")) + (hx/literal "/table/") :id + (hx/literal "/read/")) + :path]]}) + :table]] + :where (into [:or] (for [path data-perms] + [:like :path (str path "%")]))})))))) (defmulti ^:private check-permissions-for-model {:arglists '([search-result])} @@ -385,7 +388,7 @@ ;;; | Endpoint | ;;; +----------------------------------------------------------------------------------------------------------------+ -(s/defn ^:private make-search-context :- SearchContext +(s/defn ^:private search-context :- SearchContext [search-string :- (s/maybe su/NonBlankString), archived-string :- (s/maybe su/BooleanString)] {:search-string search-string :archived? (Boolean/parseBoolean archived-string) @@ -396,6 +399,6 @@ [q archived] {q (s/maybe su/NonBlankString) archived (s/maybe su/BooleanString)} - (search (make-search-context q archived))) + (search (search-context q archived))) (api/define-routes) diff --git a/src/metabase/api/session.clj b/src/metabase/api/session.clj index 24bcbdbdf67fbee5f2f5f170bee7cea667bfb84f..a24e2c9e540320938e704f1808d17c4c4f879860 100644 --- a/src/metabase/api/session.clj +++ b/src/metabase/api/session.clj @@ -66,6 +66,9 @@ (def ^:private password-fail-message (deferred-tru "Password did not match stored password.")) (def ^:private password-fail-snippet (deferred-tru "did not match stored password")) +(def ^:private disabled-account-message (deferred-tru "Your account is disabled. Please contact your administrator.")) +(def ^:private disabled-account-snippet (deferred-tru "Your account is disabled.")) + (s/defn ^:private ldap-login :- (s/maybe UUID) "If LDAP is enabled and a matching user exists return a new Session for them, or `nil` if they couldn't be authenticated." @@ -314,13 +317,20 @@ :email email}))] (create-session! :sso user))) -(defn- do-google-auth [{{:keys [token]} :body :as request}] +(defn do-google-auth + "Call to Google to perform an authentication" + [{{:keys [token]} :body :as request}] (let [token-info-response (http/post (format google-auth-token-info-url token)) {:keys [given_name family_name email]} (google-auth-token-info token-info-response)] (log/info (trs "Successfully authenticated Google Auth token for: {0} {1}" given_name family_name)) (let [session-id (api/check-500 (google-auth-fetch-or-create-user! 
given_name family_name email)) - response {:id session-id}] - (mw.session/set-session-cookie request response session-id)))) + response {:id session-id} + user (db/select-one [User :id :is_active], :email email)] + (if (and user (:is_active user)) + (mw.session/set-session-cookie request response session-id) + (throw (ex-info (str disabled-account-message) + {:status-code 400 + :errors {:account disabled-account-snippet}})))))) (api/defendpoint POST "/google_auth" "Login with Google Auth." diff --git a/src/metabase/integrations/ldap.clj b/src/metabase/integrations/ldap.clj index 392aea2facf37675917d4bed5015ecb1e45144f3..19544324ba51b1513ceb9f7b23b87e0301594d2d 100644 --- a/src/metabase/integrations/ldap.clj +++ b/src/metabase/integrations/ldap.clj @@ -95,7 +95,12 @@ (ldap-user-base))))) (defn- details->ldap-options [{:keys [host port bind-dn password security]}] - {:host (str host ":" port) + ;; Connecting via IPv6 requires us to use this form for :host, otherwise + ;; clj-ldap will find the first : and treat it as an IPv4 and port number + {:host {:address host + :port (if (string? port) + (Integer/parseInt port) + port)} :bind-dn bind-dn :password password :ssl? (= security "ssl") diff --git a/src/metabase/integrations/slack.clj b/src/metabase/integrations/slack.clj index 19676ef7a9e66a6403b4ba390a6b3aac8bf69065..f02ae9b70784eb3764507f77a33e58a713cbb747 100644 --- a/src/metabase/integrations/slack.clj +++ b/src/metabase/integrations/slack.clj @@ -54,8 +54,10 @@ request (merge-with merge {:query-params {:token token} :as :stream - :conn-timeout 1000 - :socket-timeout 1000} + ;; use a relatively long connection timeout (10 seconds) in cases where we're fetching big + ;; amounts of data -- see #11735 + :conn-timeout 10000 + :socket-timeout 10000} request)] (try (handle-response (request-fn url request)) @@ -78,15 +80,19 @@ (not-empty (get-in response [:response_metadata :next_cursor]))) (def ^:private max-list-results - "Absolute maximum number of results to fetch from Slack API list endpoints. To prevent unbounded pagination of results." - 1000) + "Absolute maximum number of results to fetch from Slack API list endpoints. To prevent unbounded pagination of + results. Don't set this too low -- some orgs have many thousands of channels (see #12978)" + 10000) (defn- paged-list-request "Make a GET request to a Slack API list `endpoint`, returning a sequence of objects returned by the top level `results-key` in the response. If additional pages of results exist, fetches those lazily, up to a total of `max-list-results`." [endpoint results-key params] - (let [response (m/mapply GET endpoint params)] + ;; use default limit (page size) of 1000 instead of 100 so we don't end up making a hundred API requests for orgs + ;; with a huge number of channels or users. + (let [default-params {:limit 1000} + response (m/mapply GET endpoint (merge default-params params))] (when (seq response) (take max-list-results @@ -109,7 +115,7 @@ (some (fn [channel] (when (= (:name channel) channel-name) channel)) - (conversations-list :exclude_archived false))) + (conversations-list))) (s/defn valid-token? "Check whether a Slack token is valid by checking whether we can call `conversations.list` with it." @@ -124,13 +130,17 @@ (defn users-list "Calls Slack API `users.list` endpoint and returns the list of available users." [& {:as query-parameters}] - (paged-list-request "users.list" :members query-parameters)) + (->> (paged-list-request "users.list" :members query-parameters) + ;; filter out deleted users and bots. 
At the time of this writing there's no way to do this in the Slack API + ;; itself so we need to do it after the fact. + (filter (complement :deleted)) + (filter (complement :is_bot)))) (defn- files-channel* [] (or (channel-with-name files-channel-name) (let [message (str (tru "Slack channel named `metabase_files` is missing!") " " - (tru "Please create the channel in order to complete the Slack integration.") + (tru "Please create or unarchive the channel in order to complete the Slack integration.") " " (tru "The channel is used for storing graphs that are included in Pulses and MetaBot answers."))] (log/error (u/format-color 'red message)) @@ -151,7 +161,7 @@ (def ^:private NonEmptyByteArray (s/constrained (Class/forName "[B") - #(pos? (count %)) + not-empty "Non-empty byte array")) (s/defn upload-file! diff --git a/src/metabase/public_settings.clj b/src/metabase/public_settings.clj index 9b99ec1a6f389bff983f6c4ca7523ffa82985308..bb4327544101b6293814f7da801534a29e9ee977 100644 --- a/src/metabase/public_settings.clj +++ b/src/metabase/public_settings.clj @@ -55,7 +55,8 @@ s (str "http://" s))] ;; check that the URL is valid - (assert (u/url? s) (tru "Invalid site URL: {0}" (pr-str s))) + (when-not (u/url? s) + (throw (ex-info (tru "Invalid site URL: {0}" (pr-str s)) {:url (pr-str s)}))) s)) (declare redirect-all-requests-to-https) @@ -68,7 +69,7 @@ :getter (fn [] (try (some-> (setting/get-string :site-url) normalize-site-url) - (catch AssertionError e + (catch clojure.lang.ExceptionInfo e (log/error e (trs "site-url is invalid; returning nil for now. Will be reset on next request."))))) :setter (fn [new-value] (let [new-value (some-> new-value normalize-site-url) diff --git a/src/metabase/query_processor.clj b/src/metabase/query_processor.clj index ce46e5afe24a80af341b980c9118bc444df2c37a..12204ff12eb3d4103ea18cbb6cea73fc5733dfbb 100644 --- a/src/metabase/query_processor.clj +++ b/src/metabase/query_processor.clj @@ -29,6 +29,7 @@ [expand-macros :as expand-macros] [fetch-source-query :as fetch-source-query] [format-rows :as format-rows] + [large-int-id :as large-int-id] [limit :as limit] [mbql-to-native :as mbql-to-native] [normalize-query :as normalize] @@ -68,6 +69,7 @@ #'cumulative-ags/handle-cumulative-aggregations #'resolve-joins/resolve-joins #'add-implicit-joins/add-implicit-joins + #'large-int-id/convert-id-to-string #'limit/limit #'format-rows/format-rows #'desugar/desugar diff --git a/src/metabase/query_processor/middleware/add_dimension_projections.clj b/src/metabase/query_processor/middleware/add_dimension_projections.clj index f949f92e524dfcabeae140922f9ae0239684c2b1..2895097e56ef191f5eeac60ce1474618ca473ed3 100644 --- a/src/metabase/query_processor/middleware/add_dimension_projections.clj +++ b/src/metabase/query_processor/middleware/add_dimension_projections.clj @@ -9,7 +9,7 @@ of values happens on the frontend, so this middleware simply adds the column to be used for replacement (e.g. `category.name`) to the `:fields` clause in pre-processing, so the Field will be fetched. Recall that Fields referenced via with `:fk->` clauses imply that JOINs will take place, which are automatically handled later in the - Query Processor pipeline. Additionally, this middleware will swap out and `:order-by` clauses referencing the + Query Processor pipeline. Additionally, this middleware will swap out `:breakout` and `:order-by` clauses referencing the original Field with ones referencing the remapped Field (for example, so we would sort by `category.name` instead of `category_id`). 
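As a rough illustration of the rewrite described in the docstring above, expressed in MBQL's JSON wire form (the field IDs are hypothetical: say 11 is `venues.category_id` and 27 is the remapped-to `categories.name`):

```js
// Rough sketch of the clause rewrite, in MBQL's JSON wire form.
// Hypothetical field IDs: 11 = venues.category_id, 27 = categories.name.
const before = {
  "source-table": 3,
  fields: [["field-id", 11]],
  "order-by": [["asc", ["field-id", 11]]],
};

// After pre-processing: an fk-> clause is appended to :fields so the
// human-readable column gets fetched, and the order-by now points at it.
const after = {
  "source-table": 3,
  fields: [
    ["field-id", 11],
    ["fk->", ["field-id", 11], ["field-id", 27]],
  ],
  "order-by": [["asc", ["fk->", ["field-id", 11], ["field-id", 27]]]],
};

// A breakout on the same field is handled similarly, except the remapped
// column is inserted in front of the original one:
//   [["field-id", 11]]  =>  [["fk->", ["field-id", 11], ["field-id", 27]],
//                            ["field-id", 11]]
```

The post-processing half of the middleware then uses the returned dimension info to tag the result columns with `:remapped_from` and `:remapped_to`, as the docstring above describes.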
@@ -23,10 +23,13 @@ appropriate `:remapped_from` and `:remapped_to` attributes in the result `:cols` in post-processing. `:remapped_from` and `:remapped_to` are the names of the columns, e.g. `category_id` is `:remapped_to` `name`, and `name` is `:remapped_from` `:category_id`." - (:require [metabase.mbql + (:require [medley.core :as m] + [metabase.mbql [schema :as mbql.s] [util :as mbql.u]] - [metabase.models.dimension :refer [Dimension]] + [metabase.models + [dimension :refer [Dimension]] + [field :refer [Field]]] [metabase.util :as u] [metabase.util.schema :as su] [schema.core :as s] @@ -37,9 +40,11 @@ (def ^:private ExternalRemappingDimension "Schema for the info we fetch about `external` type Dimensions that will be used for remappings in this Query. Fetched by the pre-processing portion of the middleware, and passed along to the post-processing portion." - {:name su/NonBlankString ; display name for the remapping - :field_id su/IntGreaterThanZero ; ID of the Field being remapped - :human_readable_field_id su/IntGreaterThanZero}) ; ID of the FK Field to remap values to + {:name su/NonBlankString ; display name for the remapping + :field_id su/IntGreaterThanZero ; ID of the Field being remapped + :human_readable_field_id su/IntGreaterThanZero ; ID of the FK Field to remap values to + (s/optional-key :field_name) su/NonBlankString ; Name of the Field being remapped + (s/optional-key :human_readable_field_name) su/NonBlankString}) ; Name of the FK field to remap values to ;;; ----------------------------------------- add-fk-remaps (pre-processing) ----------------------------------------- @@ -62,16 +67,20 @@ get hidden when displayed anyway?)" [fields :- [mbql.s/Field]] (when-let [field-id->remapping-dimension (fields->field-id->remapping-dimension fields)] - (vec - (mbql.u/match fields - ;; don't match Field IDs nested in other clauses - [(_ :guard keyword?) [:field-id _] & _] nil + ;; Reconstruct how we uniquify names in `metabase.query-processor.middleware.annotate` + (let [unique-name (comp (mbql.u/unique-name-generator) :name Field)] + (vec + (mbql.u/match fields + ;; don't match Field IDs nested in other clauses + [(_ :guard keyword?) [:field-id _] & _] nil - [:field-id (id :guard field-id->remapping-dimension)] - (let [dimension (field-id->remapping-dimension id)] - [&match - [:fk-> &match [:field-id (:human_readable_field_id dimension)]] - dimension]))))) + [:field-id (id :guard field-id->remapping-dimension)] + (let [dimension (field-id->remapping-dimension id)] + [&match + [:fk-> &match [:field-id (:human_readable_field_id dimension)]] + (assoc dimension + :field_name (-> dimension :field_id unique-name) + :human_readable_field_name (-> dimension :human_readable_field_id unique-name))])))))) (s/defn ^:private update-remapped-order-by :- [mbql.s/OrderBy] "Order by clauses that include an external remapped column should be replace that original column in the order by with @@ -84,26 +93,41 @@ [direction remapped-col] order-by-clause)))) +(defn- update-remapped-breakout + [field->remapped-col breakout-clause] + (vec (mapcat (fn [field] + (if-let [remapped-col (get field->remapped-col field)] + [remapped-col field] + [field])) + breakout-clause))) + (s/defn ^:private add-fk-remaps :- [(s/one (s/maybe [ExternalRemappingDimension]) "external remapping dimensions") (s/one mbql.s/Query "query")] "Add any Fields needed for `:external` remappings to the `:fields` clause of the query, and update `:order-by` - clause as needed. 
Returns a pair like `[external-remapping-dimensions updated-query]`." - [{{:keys [fields order-by]} :query, :as query} :- mbql.s/Query] - ;; TODO - I think we need to handle Fields in `:breakout` here as well... - ;; fetch remapping column pairs if any exist... - (if-let [remap-col-tuples (seq (create-remap-col-tuples fields))] - ;; if they do, update `:fields` and `:order-by` clauses accordingly and add to the query - (let [new-fields (vec (concat fields (map second remap-col-tuples))) - ;; make a map of field-id-clause -> fk-clause from the tuples - new-order-by (update-remapped-order-by (into {} (for [[field-clause fk-clause] remap-col-tuples] - [field-clause fk-clause])) - order-by)] - ;; return the Dimensions we are using and the query - [(map last remap-col-tuples) - (cond-> (assoc-in query [:query :fields] new-fields) - (seq new-order-by) (assoc-in [:query :order-by] new-order-by))]) - ;; otherwise return query as-is - [nil query])) + and `breakout` clauses as needed. Returns a pair like `[external-remapping-dimensions updated-query]`." + [{{:keys [fields order-by breakout source-query]} :query, :as query} :- mbql.s/Query] + (let [[source-query-remappings query] + (if (and source-query (not (:native source-query))) ;; Only do lifting if source is MBQL query + (let [[source-query-remappings source-query] (add-fk-remaps (assoc query :query source-query))] + [source-query-remappings (assoc-in query [:query :source-query] (:query source-query))]) + [nil query])] + ;; fetch remapping column pairs if any exist... + (if-let [remap-col-tuples (seq (create-remap-col-tuples (concat fields breakout)))] + ;; if they do, update `:fields`, `:order-by` and `:breakout` clauses accordingly and add to the query + (let [new-fields (into fields (map second) remap-col-tuples) + ;; make a map of field-id-clause -> fk-clause from the tuples + field->remapped-col (into {} (for [[field-clause fk-clause] remap-col-tuples] + [field-clause fk-clause])) + new-breakout (update-remapped-breakout field->remapped-col breakout) + new-order-by (update-remapped-order-by field->remapped-col order-by)] + ;; return the Dimensions we are using and the query + [(concat source-query-remappings (map last remap-col-tuples)) + (cond-> query + (seq fields) (assoc-in [:query :fields] new-fields) + (seq order-by) (assoc-in [:query :order-by] new-order-by) + (seq breakout) (assoc-in [:query :breakout] new-breakout))]) + ;; otherwise return query as-is + [source-query-remappings query]))) ;;; ---------------------------------------- remap-results (post-processing) ----------------------------------------- @@ -116,28 +140,43 @@ [columns :- [su/Map] remapping-dimensions :- (s/maybe [ExternalRemappingDimension]) internal-remap-columns :- (s/maybe [su/Map])] - (let [column-id->column (u/key-by :id columns) + ;; We have to complicate our lives a bit and account for the possibility that dimensions might be + ;; used in an upstream `source-query`. If so, `columns` will treat them as `:field-value`s, erasing + ;; IDs. In that case reconstruct the mappings using names. + ;; + ;; TODO: + ;; Matching by name is brittle and might produce wrong results when there are name clashes + ;; in the source fields. 
+ (let [column-id->column (u/key-by (some-fn :id :name) columns) name->internal-remapped-to-col (u/key-by :remapped_from internal-remap-columns) - id->remapped-to-dimension (u/key-by :field_id remapping-dimensions) - id->remapped-from-dimension (u/key-by :human_readable_field_id remapping-dimensions)] + id->remapped-to-dimension (merge (u/key-by :field_id remapping-dimensions) + (u/key-by :field_name remapping-dimensions)) + id->remapped-from-dimension (merge (u/key-by :human_readable_field_id remapping-dimensions) + (u/key-by :human_readable_field_name remapping-dimensions)) + get-first-key (fn [m & ks] + (some-> (m/find-first m ks) m))] (for [{:keys [id], column-name :name, :as column} columns] (merge {:base_type :type/*} column ;; if one of the internal remapped columns says it's remapped from this column, add a matching `:remapped_to` ;; entry - (when-let [{remapped-to-name :name} (get name->internal-remapped-to-col column-name)] + (when-let [{remapped-to-name :name} (name->internal-remapped-to-col column-name)] {:remapped_to remapped-to-name}) ;; if the pre-processing remapping Dimension info contains an entry where this Field's ID is `:field_id`, add ;; an entry noting the name of the Field it gets remapped to - (when-let [{remapped-to-id :human_readable_field_id} (get id->remapped-to-dimension id)] - {:remapped_to (:name (get column-id->column remapped-to-id))}) + (when-let [{remapped-to-id :human_readable_field_id + remapped-to-name :human_readable_field_name} + (id->remapped-to-dimension (or id column-name))] + {:remapped_to (:name (get-first-key column-id->column remapped-to-id remapped-to-name))}) ;; if the pre-processing remapping Dimension info contains an entry where this Field's ID is ;; `:human_readable_field_id`, add an entry noting the name of the Field it gets remapped from, and use the ;; `:display_name` of the Dimension - (when-let [{dimension-name :name, remapped-from-id :field_id} (get id->remapped-from-dimension id)] + (when-let [{dimension-name :name + remapped-from-id :field_id + remapped-from-name :field_name} (id->remapped-from-dimension (or id column-name))] {:display_name dimension-name - :remapped_from (:name (get column-id->column remapped-from-id))}))))) + :remapped_from (:name (get-first-key column-id->column remapped-from-id remapped-from-name))}))))) (defn- create-remapped-col [col-name remapped-from base-type] {:description nil @@ -241,7 +280,7 @@ added and each row flowing through needs to include the remapped data for the new column. For external remappings, the column information needs to be updated with what it's being remapped from and the user specified name for the remapped column." - [{:keys [cols], :as metadata} {:keys [internal-only-dims]} rf] + [{:keys [internal-only-dims]} rf] (if-let [remap-fn (make-row-map-fn internal-only-dims)] (fn ([] @@ -258,7 +297,7 @@ (fn [metadata] (let [internal-cols-info (internal-columns-info (:cols metadata)) metadata (add-remapped-cols metadata remapping-dimensions internal-cols-info)] - (remap-results-xform metadata internal-cols-info (rff metadata))))) + (remap-results-xform internal-cols-info (rff metadata))))) (defn add-remapping "Query processor middleware. `qp` is the query processor, returns a function that works on a `query` map. 
Delgates to diff --git a/src/metabase/query_processor/middleware/large_int_id.clj b/src/metabase/query_processor/middleware/large_int_id.clj new file mode 100644 index 0000000000000000000000000000000000000000..e4102a95a0dc1ebc894405dd29954158f676ddc0 --- /dev/null +++ b/src/metabase/query_processor/middleware/large_int_id.clj @@ -0,0 +1,47 @@ +(ns metabase.query-processor.middleware.large-int-id + "Middleware for handling conversion of IDs to strings for proper display of large numbers" + (:require [metabase.mbql.util :as mbql.u] + [metabase.models.field :refer [Field]])) + +(defn- result-int->string + [field-indexes rf] + (fn + ([] (rf)) + ([result] (rf result)) + ([result row] + (rf result (reduce #(update-in %1 [%2] str) row field-indexes))))) + +(defn convert-id-to-string + "Converts any ID (:type/PK and :type/FK) in a result to a string to handle a number > 2^51 + or < -2^51, the JavaScript float mantissa. This will allow proper display of large numbers, + like IDs from services like social media. All ID numbers are converted to avoid the performance + penalty of a comparison based on size." + [qp] + (fn [{{:keys [js-int-to-string?] :or {js-int-to-string? false}} :middleware, :as query} rff context] + ;; currently, this excludes `:field-literal` values like aggregations. + ;; + ;; for a query like below, *no* conversion will occur + ;; (mt/mbql-query venues + ;; {:source-query {:source-table $$venues + ;; :aggregation [[:aggregation-options + ;; [:avg $id] + ;; {:name "some_generated_name", :display-name "My Cool Ag"}]] + ;; :breakout [$price]}}) + ;; when you run in this fashion, you lose the ability to determine if it's an ID - you get a `:fields` value + ;; like: `:fields [[:field-literal "PRICE" :type/Integer] [:field-literal "some_generated_name" :type/BigInteger]]` + ;; so, short of turning all `:type/Integer` derived values into strings, this is the best approximation + ;; of a fix that can be accomplished. + (let [field-indexes (keep-indexed + (fn [idx val] + (let [field-id (mbql.u/field-clause->id-or-literal val)] + (when-let [field (when (number? field-id) + (Field field-id))] + (when (and (or (isa? (:special_type field) :type/PK) + (isa? (:special_type field) :type/FK)) + (isa? (:base_type field) :type/Integer)) + idx)))) + (:fields (:query query)))] + (qp query (if (and js-int-to-string? (seq field-indexes)) + #(result-int->string field-indexes (rff %)) + rff) + context)))) diff --git a/src/metabase/query_processor/streaming/xlsx.clj b/src/metabase/query_processor/streaming/xlsx.clj index 1ca44891739b548814d23f5213b5369722914579..7b6fc7d98840723b825e091ff92715f7a1e668e6 100644 --- a/src/metabase/query_processor/streaming/xlsx.clj +++ b/src/metabase/query_processor/streaming/xlsx.clj @@ -2,14 +2,12 @@ (:require [cheshire.core :as json] [dk.ative.docjure.spreadsheet :as spreadsheet] [java-time :as t] - [metabase.query-processor.streaming - [common :as common] - [interface :as i]] + [metabase.query-processor.streaming.interface :as i] [metabase.util [date-2 :as u.date] [i18n :refer [tru]]]) (:import java.io.OutputStream - org.apache.poi.ss.usermodel.Cell + [org.apache.poi.ss.usermodel Cell CellType] org.apache.poi.xssf.streaming.SXSSFWorkbook)) (defmethod i/stream-options :xlsx @@ -19,12 +17,45 @@ :headers {"Content-Disposition" (format "attachment; filename=\"query_result_%s.xlsx\"" (u.date/format (t/zoned-date-time)))}}) +(defprotocol ExcelFormatValue + "Protocol for specifying how objects of various classes in QP result rows should be formatted for Excel + exports. 
The generic version in `common/FormatValue` does not work for Excel due to date/time formats." + (excel-format-value [this] + "Format this value in a QP result row appropriately for a results download to Excel.")) + +;; this version of the protocol resolves #10803 by not converting datetimes to ISO8601 (which Excel cannot parse) +;; Docjure can handle java.util.Date instances, but not the new Java 8 Time instances, so we convert everything +;; to java.util.Date. +(extend-protocol ExcelFormatValue + nil + (excel-format-value [_] nil) + + Object + (excel-format-value [this] this) + + java.time.temporal.Temporal + (excel-format-value [this] + (t/java-date this)) + + java.time.LocalDateTime + (excel-format-value [this] + ;; this is a sticky one - LocalDateTime where? Using the server's timezone for the conversion. + (t/java-date (t/zoned-date-time this (t/zone-id)))) + + java.time.OffsetDateTime + (excel-format-value [this] + (t/java-date this)) + + java.time.ZonedDateTime + (excel-format-value [this] + (t/java-date this))) + ;; add a generic implementation for the method that writes values to XLSX cells that just piggybacks off the ;; implementations we've already defined for encoding things as JSON. These implementations live in ;; `metabase.middleware`. (defmethod spreadsheet/set-cell! Object [^Cell cell, value] - (when (= (.getCellType cell) Cell/CELL_TYPE_FORMULA) - (.setCellType cell Cell/CELL_TYPE_STRING)) + (when (= (.getCellType cell) CellType/FORMULA) + (.setCellType cell CellType/STRING)) ;; stick the object in a JSON map and encode it, which will force conversion to a string. Then unparse that JSON and ;; use the resulting value as the cell's new String value. There might be some more efficient way of doing this but ;; I'm not sure what it is. @@ -42,7 +73,7 @@ (spreadsheet/add-row! sheet (map (some-fn :display_name :name) cols))) (write-row! [_ row _] - (spreadsheet/add-row! sheet (map common/format-value row))) + (spreadsheet/add-row! sheet (map excel-format-value row))) (finish! [_ _] (spreadsheet/save-workbook-into-stream! os workbook) diff --git a/src/metabase/task/sync_databases.clj b/src/metabase/task/sync_databases.clj index a35d697d2ef7143748c49891a19906995be42ea8..6d69111ca9528fb58cffc1d43be01adcc472b403 100644 --- a/src/metabase/task/sync_databases.clj +++ b/src/metabase/task/sync_databases.clj @@ -9,7 +9,7 @@ [metabase [task :as task] [util :as u]] - [metabase.models.database :refer [Database]] + [metabase.models.database :as database :refer [Database]] [metabase.sync [analyze :as analyze] [field-values :as field-values] @@ -27,6 +27,8 @@ ;;; | JOB LOGIC | ;;; +----------------------------------------------------------------------------------------------------------------+ +(declare unschedule-tasks-for-db!) + (s/defn ^:private job-context->database-id :- (s/maybe su/IntGreaterThanZero) "Get the Database ID referred to in `job-context`." [job-context] @@ -34,25 +36,39 @@ ;; The DisallowConcurrentExecution on the two defrecords below attaches an annotation to the generated class that will ;; constrain the job execution to only be one at a time. Other triggers wanting the job to run will misfire. -(jobs/defjob ^{org.quartz.DisallowConcurrentExecution true} SyncAndAnalyzeDatabase [job-context] + +(defn sync-and-analyze-database + "The sync and analyze database job, as a function that can be used in a test" + [job-context] (when-let [database-id (job-context->database-id job-context)] (log/info (trs "Starting sync task for Database {0}." 
database-id))
     (when-let [database (or (Database database-id)
-                            (log/warn (trs "Cannot sync Database {0}: Database does not exist." database-id)))]
+                            (do
+                              (unschedule-tasks-for-db! (database/map->DatabaseInstance {:id database-id}))
+                              (log/warn (trs "Cannot sync Database {0}: Database does not exist." database-id))))]
       (sync-metadata/sync-db-metadata! database)
       ;; only run analysis if this is a "full sync" database
       (when (:is_full_sync database)
-        (analyze/analyze-db! database)))))
+        (analyze/analyze-db! database)))))
 
-(jobs/defjob ^{org.quartz.DisallowConcurrentExecution true} UpdateFieldValues [job-context]
+(jobs/defjob ^{org.quartz.DisallowConcurrentExecution true} SyncAndAnalyzeDatabase [job-context]
+  (sync-and-analyze-database job-context))
+
+(defn update-field-values
+  "The update field values job, as a function that can be used in a test"
+  [job-context]
   (when-let [database-id (job-context->database-id job-context)]
     (log/info (trs "Update Field values task triggered for Database {0}." database-id))
     (when-let [database (or (Database database-id)
-                            (log/warn "Cannot update Field values for Database {0}: Database does not exist." database-id))]
+                            (do
+                              (unschedule-tasks-for-db! (database/map->DatabaseInstance {:id database-id}))
+                              (log/warn (trs "Cannot update Field values for Database {0}: Database does not exist." database-id))))]
       (if (:is_full_sync database)
         (field-values/update-field-values! database)
         (log/info (trs "Skipping update, automatic Field value updates are disabled for Database {0}." database-id))))))
 
+(jobs/defjob ^{org.quartz.DisallowConcurrentExecution true} UpdateFieldValues [job-context]
+  (update-field-values job-context))
 
 ;;; +----------------------------------------------------------------------------------------------------------------+
 ;;; |                                        TASK INFO AND GETTER FUNCTIONS                                           |
 ;;; +----------------------------------------------------------------------------------------------------------------+
diff --git a/test/metabase/api/dataset_test.clj b/test/metabase/api/dataset_test.clj
index 9e6de7aebd4d17c6b4009d466b5302e26500f278..52f5324751d7a1496cd0b2a5dbc4091499a41abf 100644
--- a/test/metabase/api/dataset_test.clj
+++ b/test/metabase/api/dataset_test.clj
@@ -56,7 +56,8 @@
     (db/select-one QueryExecution {:order-by [[:started_at :desc]]}))
 
 (def ^:private query-defaults
-  {:middleware {:add-default-userland-constraints? true}})
+  {:middleware {:add-default-userland-constraints? true
+                :js-int-to-string? 
true}})
 
 (deftest basic-query-test
   (testing "POST /api/dataset"
diff --git a/test/metabase/api/query_description_test.clj b/test/metabase/api/query_description_test.clj
index ffb603bb1a4a36e7af3ee09fa12ba7efe47f83bf..04d9a7a791ae0619b48e453da366541fd1618dbb 100644
--- a/test/metabase/api/query_description_test.clj
+++ b/test/metabase/api/query_description_test.clj
@@ -31,7 +31,7 @@
     (testing "with cumulative sum of price"
       (is (= {:table       "Venues"
               :aggregation [{:type :cum-sum
-                             :arg  ["Price"]}]}
+                             :arg  "Price"}]}
              (sut/generate-query-description (Table (mt/id :venues))
                                              (:query (mt/mbql-query :venues
                                                        {:aggregation [[:cum-sum $price]]}))))))
@@ -88,4 +88,21 @@
                            [:sum [:* [:field-id (mt/id :venues :latitude)]
                                      [:field-id (mt/id :venues :longitude)]]]
-                           {:display-name "Nonsensical named metric"}]]})))))))
+                           {:display-name "Nonsensical named metric"}]]}))))
+
+  (testing "with unnamed complex aggregation"
+    (is (= {:table       "Venues"
+            :aggregation [{:type :sum :arg ["Latitude" "*" "Longitude"]}]}
+           (sut/generate-query-description (Table (mt/id :venues))
+                                           {:aggregation [[:sum [:*
+                                                                 [:field-id (mt/id :venues :latitude)]
+                                                                 [:field-id (mt/id :venues :longitude)]]]]}))))
+
+  (testing "with unnamed complex aggregation with multiple arguments"
+    (is (= {:table       "Venues"
+            :aggregation [{:type :sum :arg ["Latitude" "+" "Longitude" "+" "ID"]}]}
+           (sut/generate-query-description (Table (mt/id :venues))
+                                           {:aggregation [[:sum [:+
+                                                                 [:field-id (mt/id :venues :latitude)]
+                                                                 [:field-id (mt/id :venues :longitude)]
+                                                                 [:field-id (mt/id :venues :id)]]]]})))))))
diff --git a/test/metabase/api/search_test.clj b/test/metabase/api/search_test.clj
index 90821191d4775d4467badf3b013317d8dfecd1e7..d0b76a0a84bda0c655eb1cc0053430a17b8bc807 100644
--- a/test/metabase/api/search_test.clj
+++ b/test/metabase/api/search_test.clj
@@ -12,7 +12,6 @@
             [metabase.models
              [permissions :as perms]
              [permissions-group :as group]]
-            [metabase.test.data.users :as test-users]
             [toucan.db :as db]))
 
 (def ^:private default-search-row
@@ -110,16 +109,27 @@
 (defmacro ^:private with-search-items-in-collection [created-items-sym search-string & body]
   `(do-with-search-items ~search-string false (fn [~created-items-sym] ~@body)))
 
+(def ^:private ^:dynamic *search-request-results-database-id*
+  "Filter out all results from `search-request` that don't have this Database ID. Default: the default H2 `test-data`
+  Database. Other results are filtered out so these tests can be run from the REPL without the presence of other
+  Databases causing the tests to fail."
+  mt/id)
+
 (defn- search-request [user-kwd & params]
-  (vec
-   (sorted-results
-    (let [raw-results (apply (test-users/user->client user-kwd) :get 200 "search" params)]
-      (for [result raw-results
-            ;; filter out any results not from the usual test data DB (e.g. results from other drivers)
-            :when (contains? #{(mt/id) nil} (:database_id result))]
-        (-> result
-            mt/boolean-ids-and-timestamps
-            (update :collection_name #(some-> % string?))))))))
+  (let [raw-results      (apply (mt/user->client user-kwd) :get 200 "search" params)
+        keep-database-id (if (fn? *search-request-results-database-id*)
+                           (*search-request-results-database-id*)
+                           *search-request-results-database-id*)]
+    (if (:error raw-results)
+      raw-results
+      (vec
+       (sorted-results
+        (for [result raw-results
+              ;; filter out any results not from the usual test data DB (e.g. results from other drivers)
+              :when (contains? 
#{keep-database-id nil} (:database_id result))] + (-> result + mt/boolean-ids-and-timestamps + (update :collection_name #(some-> % string?))))))))) (deftest basic-test (testing "Basic search, should find 1 of each entity type, all items in the root collection" @@ -153,7 +163,7 @@ (mt/with-non-admin-groups-no-root-collection-perms (with-search-items-in-root-collection "test" (mt/with-temp* [PermissionsGroup [group] - PermissionsGroupMembership [_ {:user_id (test-users/user->id :rasta), :group_id (u/get-id group)}]] + PermissionsGroupMembership [_ {:user_id (mt/user->id :rasta), :group_id (u/get-id group)}]] (perms/grant-permissions! group (perms/collection-read-path {:metabase.models.collection.root/is-root? true})) (is (= (remove (comp #{"collection"} :model) (default-search-results)) (search-request :rasta :q "test"))))))) @@ -163,7 +173,7 @@ (with-search-items-in-collection {:keys [collection]} "test" (with-search-items-in-root-collection "test2" (mt/with-temp* [PermissionsGroup [group] - PermissionsGroupMembership [_ {:user_id (test-users/user->id :rasta), :group_id (u/get-id group)}]] + PermissionsGroupMembership [_ {:user_id (mt/user->id :rasta), :group_id (u/get-id group)}]] (perms/grant-collection-read-permissions! group (u/get-id collection)) (is (= (sorted-results (into @@ -179,7 +189,7 @@ (with-search-items-in-collection {:keys [collection]} "test" (with-search-items-in-root-collection "test2" (mt/with-temp* [PermissionsGroup [group] - PermissionsGroupMembership [_ {:user_id (test-users/user->id :rasta), :group_id (u/get-id group)}]] + PermissionsGroupMembership [_ {:user_id (mt/user->id :rasta), :group_id (u/get-id group)}]] (perms/grant-permissions! group (perms/collection-read-path {:metabase.models.collection.root/is-root? true})) (perms/grant-collection-read-permissions! group collection) (is (= (sorted-results @@ -194,7 +204,7 @@ (with-search-items-in-collection {coll-1 :collection} "test" (with-search-items-in-collection {coll-2 :collection} "test2" (mt/with-temp* [PermissionsGroup [group] - PermissionsGroupMembership [_ {:user_id (test-users/user->id :rasta), :group_id (u/get-id group)}]] + PermissionsGroupMembership [_ {:user_id (mt/user->id :rasta), :group_id (u/get-id group)}]] (perms/grant-collection-read-permissions! group (u/get-id coll-1)) (perms/grant-collection-read-permissions! group (u/get-id coll-2)) (is (= (sorted-results @@ -209,7 +219,7 @@ (with-search-items-in-collection {coll-1 :collection} "test" (with-search-items-in-collection {coll-2 :collection} "test2" (mt/with-temp* [PermissionsGroup [group] - PermissionsGroupMembership [_ {:user_id (test-users/user->id :rasta), :group_id (u/get-id group)}]] + PermissionsGroupMembership [_ {:user_id (mt/user->id :rasta), :group_id (u/get-id group)}]] (perms/grant-collection-read-permissions! 
group (u/get-id coll-1)) (is (= (sorted-results (into @@ -243,18 +253,18 @@ (testing "Favorites are per user, so other user's favorites don't cause search results to be favorited" (with-search-items-in-collection {:keys [card dashboard]} "test" (mt/with-temp* [CardFavorite [_ {:card_id (u/get-id card) - :owner_id (test-users/user->id :rasta)}] + :owner_id (mt/user->id :rasta)}] DashboardFavorite [_ {:dashboard_id (u/get-id dashboard) - :user_id (test-users/user->id :rasta)}]] + :user_id (mt/user->id :rasta)}]] (is (= (default-results-with-collection) (search-request :crowberto :q "test")))))) (testing "Basic search, should find 1 of each entity type and include favorites when available" (with-search-items-in-collection {:keys [card dashboard]} "test" (mt/with-temp* [CardFavorite [_ {:card_id (u/get-id card) - :owner_id (test-users/user->id :crowberto)}] + :owner_id (mt/user->id :crowberto)}] DashboardFavorite [_ {:dashboard_id (u/get-id dashboard) - :user_id (test-users/user->id :crowberto)}]] + :user_id (mt/user->id :crowberto)}]] (is (= (on-search-types #{"dashboard" "card"} #(assoc % :favorite true) (default-results-with-collection)) @@ -295,7 +305,7 @@ (filter (fn [{:keys [model id]}] (and (= id (u/get-id pulse)) (= "pulse" model))) - ((test-users/user->client :crowberto) :get 200 "search")))))))) + ((mt/user->client :crowberto) :get 200 "search")))))))) (defn- default-table-search-row [table-name] (merge @@ -308,22 +318,28 @@ :model "table" :database_id true})) +(defmacro ^:private do-test-users {:style/indent 1} [[user-binding users] & body] + `(doseq [user# ~users + :let [~user-binding user#]] + (testing (format "\nuser = %s" user#) + ~@body))) + (deftest table-test - (testing "You should see Tables in the search results!" + (testing "You should see Tables in the search results!\n" (mt/with-temp Table [table {:name "Round Table"}] - (doseq [user [:crowberto :rasta]] + (do-test-users [user [:crowberto :rasta]] (is (= [(default-table-search-row "Round Table")] (search-request user :q "Round Table"))))) (testing "When searching with ?archived=true, normal Tables should not show up in the results" (let [table-name (mt/random-name)] (mt/with-temp Table [table {:name table-name}] - (doseq [user [:crowberto :rasta]] + (do-test-users [user [:crowberto :rasta]] (is (= [] (search-request user :q table-name :archived true))))))) (testing "*archived* tables should not appear in search results" (let [table-name (mt/random-name)] (mt/with-temp Table [table {:name table-name, :active false}] - (doseq [user [:crowberto :rasta]] + (do-test-users [user [:crowberto :rasta]] (is (= [] (search-request user :q table-name))))))) (testing "you should not be able to see a Table if the current user doesn't have permissions for that Table" @@ -331,7 +347,22 @@ Table [table {:db_id db-id}]] (perms/revoke-permissions! (group/all-users) db-id) (is (= [] - (search-request :rasta :q (:name table)))))))) + (binding [*search-request-results-database-id* db-id] + (search-request :rasta :q (:name table))))))))) + +(deftest all-users-no-perms-table-test + (testing (str "If the All Users group doesn't have perms to view a Table, but the current User is in a group that " + "does have perms, they should still be able to see it (#12332)") + (mt/with-temp* [Database [{db-id :id}] + Table [table {:name "Round Table", :db_id db-id}] + PermissionsGroup [{group-id :id}] + PermissionsGroupMembership [_ {:group_id group-id, :user_id (mt/user->id :rasta)}]] + (perms/revoke-permissions! 
(group/all-users) db-id (:schema table) (:id table)) + (perms/grant-permissions! group-id (perms/table-read-path table)) + (do-test-users [user [:crowberto :rasta]] + (is (= [(default-table-search-row "Round Table")] + (binding [*search-request-results-database-id* db-id] + (search-request user :q "Round Table")))))))) (deftest collection-namespaces-test (testing "Search should only return Collections in the 'default' namespace" diff --git a/test/metabase/api/session_test.clj b/test/metabase/api/session_test.clj index f154207bd21d1d7081728a1ae1839c1220372e3b..4ea956c38c0c0124e9af79ab0bbc415ee8d937aa 100644 --- a/test/metabase/api/session_test.clj +++ b/test/metabase/api/session_test.clj @@ -20,7 +20,8 @@ [metabase.test.integrations.ldap :as ldap.test] [schema.core :as s] [toucan.db :as db]) - (:import java.util.UUID)) + (:import clojure.lang.ExceptionInfo + java.util.UUID)) ;; one of the tests below compares the way properties for the H2 driver are translated, so we need to make sure it's ;; loaded @@ -406,6 +407,32 @@ token-2)} token-1))))))) +(deftest google-auth-tests + (mt/with-temporary-setting-values [google-auth-client-id "PRETEND-GOOD-GOOGLE-CLIENT-ID"] + (testing "with an active account" + (mt/with-temp User [user {:email "test@metabase.com" :is_active true}] + (with-redefs [http/post (fn [url] {:status 200 + :body (str "{\"aud\":\"PRETEND-GOOD-GOOGLE-CLIENT-ID\"," + "\"email_verified\":\"true\"," + "\"first_name\":\"test\"," + "\"last_name\":\"user\"," + "\"email\":\"test@metabase.com\"}")})] + + (let [result (session-api/do-google-auth {:body {:token "foo"}})] + (is (= 200 (:status result))))))) + (testing "with a disabled account" + (mt/with-temp User [user {:email "test@metabase.com" :is_active false}] + (with-redefs [http/post (fn [url] {:status 200 + :body (str "{\"aud\":\"PRETEND-GOOD-GOOGLE-CLIENT-ID\"," + "\"email_verified\":\"true\"," + "\"first_name\":\"test\"," + "\"last_name\":\"user\"," + "\"email\":\"test@metabase.com\"}")})] + (is (thrown-with-msg? + ExceptionInfo + #"Your account is disabled. Please contact your administrator." + (session-api/do-google-auth {:body {:token "foo"}})))))))) + ;;; --------------------------------------- google-auth-fetch-or-create-user! ---------------------------------------- (deftest google-auth-fetch-or-create-user!-test diff --git a/test/metabase/driver/mysql_test.clj b/test/metabase/driver/mysql_test.clj index 9e3638a1217dda56cb892b48acd7d0fed9c7e0fa..405b109eee835b00b59c9885a094e25105c1da2a 100644 --- a/test/metabase/driver/mysql_test.clj +++ b/test/metabase/driver/mysql_test.clj @@ -12,39 +12,41 @@ [metabase.driver.sql-jdbc.connection :as sql-jdbc.conn] [metabase.models [database :refer [Database]] - [field :refer [Field]]] + [field :refer [Field]] + [table :refer [Table]]] [metabase.test.data.interface :as tx] - [toucan.db :as db] + [toucan + [db :as db] + [hydrate :refer [hydrate]]] [toucan.util.test :as tt])) (deftest all-zero-dates-test (mt/test-driver :mysql (testing (str "MySQL allows 0000-00-00 dates, but JDBC does not; make sure that MySQL is converting them to NULL " - "when returning them like we asked")) - (let [spec (sql-jdbc.conn/connection-details->spec :mysql (tx/dbdef->connection-details :mysql :server nil))] - (try - ;; Create the DB - (doseq [sql ["DROP DATABASE IF EXISTS all_zero_dates;" - "CREATE DATABASE all_zero_dates;"]] - (jdbc/execute! 
spec [sql])) - ;; Create Table & add data - (let [details (tx/dbdef->connection-details :mysql :db {:database-name "all_zero_dates"}) - spec (-> (sql-jdbc.conn/connection-details->spec :mysql details) - ;; allow inserting dates where value is '0000-00-00' -- this is disallowed by default on newer - ;; versions of MySQL, but we still want to test that we can handle it correctly for older ones - (assoc :sessionVariables "sql_mode='ALLOW_INVALID_DATES'"))] - (doseq [sql ["CREATE TABLE `exciting-moments-in-history` (`id` integer, `moment` timestamp);" - "INSERT INTO `exciting-moments-in-history` (`id`, `moment`) VALUES (1, '0000-00-00');"]] + "when returning them like we asked") + (let [spec (sql-jdbc.conn/connection-details->spec :mysql (tx/dbdef->connection-details :mysql :server nil))] + (try + ;; Create the DB + (doseq [sql ["DROP DATABASE IF EXISTS all_zero_dates;" + "CREATE DATABASE all_zero_dates;"]] (jdbc/execute! spec [sql])) - ;; create & sync MB DB - (tt/with-temp Database [database {:engine "mysql", :details details}] - (sync/sync-database! database) - (mt/with-db database - ;; run the query - (is (= [[1 nil]] - (mt/rows - (mt/run-mbql-query exciting-moments-in-history))))))))))) - + ;; Create Table & add data + (let [details (tx/dbdef->connection-details :mysql :db {:database-name "all_zero_dates"}) + spec (-> (sql-jdbc.conn/connection-details->spec :mysql details) + ;; allow inserting dates where value is '0000-00-00' -- this is disallowed by default on newer + ;; versions of MySQL, but we still want to test that we can handle it correctly for older ones + (assoc :sessionVariables "sql_mode='ALLOW_INVALID_DATES'"))] + (doseq [sql ["CREATE TABLE `exciting-moments-in-history` (`id` integer, `moment` timestamp);" + "INSERT INTO `exciting-moments-in-history` (`id`, `moment`) VALUES (1, '0000-00-00');"]] + (jdbc/execute! spec [sql])) + ;; create & sync MB DB + (tt/with-temp Database [database {:engine "mysql", :details details}] + (sync/sync-database! database) + (mt/with-db database + ;; run the query + (is (= [[1 nil]] + (mt/rows + (mt/run-mbql-query exciting-moments-in-history)))))))))))) ;; Test how TINYINT(1) columns are interpreted. By default, they should be interpreted as integers, but with the ;; correct additional options, we should be able to change that -- see @@ -208,3 +210,51 @@ ;; disable the middleware that normally converts `LocalTime` to `Strings` so we can verify ;; our driver is actually doing the right thing :middleware {:format-rows? false})))))))))) + +(defn- table-fingerprint + [{:keys [fields name]}] + {:name name + :fields (map #(select-keys % [:name :base_type]) fields)}) + +(deftest system-versioned-tables-test + (mt/test-driver :mysql + (testing "system versioned tables appear during a sync" + (let [spec (sql-jdbc.conn/connection-details->spec :mysql (tx/dbdef->connection-details :mysql :server nil))] + (try + ;; Create the DB + (doseq [sql ["DROP DATABASE IF EXISTS versioned_tables;" + "CREATE DATABASE versioned_tables;"]] + (jdbc/execute! spec [sql])) + ;; Create Table & add data + (let [details (tx/dbdef->connection-details :mysql :db {:database-name "versioned_tables"}) + spec (sql-jdbc.conn/connection-details->spec :mysql details) + compat (try + (doseq [sql ["CREATE TABLE IF NOT EXISTS src1 (id INTEGER, t TEXT);" + "CREATE TABLE IF NOT EXISTS src2 (id INTEGER, t TEXT);" + "ALTER TABLE src2 ADD SYSTEM VERSIONING;" + "INSERT INTO src1 VALUES (1, '2020-03-01 12:20:35');" + "INSERT INTO src2 VALUES (1, '2020-03-01 12:20:35');"]] + (jdbc/execute! 
spec [sql])) + true + (catch java.sql.SQLSyntaxErrorException se + ;; if an error is received with SYSTEM VERSIONING mentioned, the version + ;; of mysql or mariadb being tested against does not support system versioning, + ;; so do not continue + (if (re-matches #".*VERSIONING'.*" (.getMessage se)) + false + (throw se))))] + (when compat + (tt/with-temp Database [database {:engine "mysql", :details details}] + (sync/sync-database! database) + (is (= [{:name "src1" + :fields [{:name "id" + :base_type :type/Integer} + {:name "t" + :base_type :type/Text}]} + {:name "src2" + :fields [{:name "id" + :base_type :type/Integer} + {:name "t" + :base_type :type/Text}]}] + (->> (hydrate (db/select Table :db_id (:id database) {:order-by [:name]}) :fields) + (map table-fingerprint)))))))))))) diff --git a/test/metabase/integrations/ldap_test.clj b/test/metabase/integrations/ldap_test.clj index 0ad3bc1a860f27955a91c9bb9d832adfc58c88f5..67cac8b8db7d0605fbdb60844e13971c0ef39183 100644 --- a/test/metabase/integrations/ldap_test.clj +++ b/test/metabase/integrations/ldap_test.clj @@ -1,5 +1,5 @@ (ns metabase.integrations.ldap-test - (:require [expectations :refer [expect]] + (:require [clojure.test :refer :all] [metabase.integrations.ldap :as ldap] [metabase.test.integrations.ldap :as ldap.test] [metabase.test.util :as tu])) @@ -16,109 +16,106 @@ ;; See test_resources/ldap.ldif for fixtures ;; The connection test should pass with valid settings -(expect - {:status :SUCCESS} - (ldap.test/with-ldap-server - (ldap/test-ldap-connection (get-ldap-details)))) - -;; The connection test should allow anonymous binds -(expect - {:status :SUCCESS} - (ldap.test/with-ldap-server - (ldap/test-ldap-connection (dissoc (get-ldap-details) :bind-dn)))) - -;; The connection test should fail with an invalid user search base -(expect - :ERROR - (ldap.test/with-ldap-server - (:status (ldap/test-ldap-connection (assoc (get-ldap-details) :user-base "dc=example,dc=com"))))) - -;; The connection test should fail with an invalid group search base -(expect - :ERROR - (ldap.test/with-ldap-server - (:status (ldap/test-ldap-connection (assoc (get-ldap-details) :group-base "dc=example,dc=com"))))) - -;; The connection test should fail with an invalid bind DN -(expect - :ERROR - (ldap.test/with-ldap-server - (:status (ldap/test-ldap-connection (assoc (get-ldap-details) :bind-dn "cn=Not Directory Manager"))))) - -;; The connection test should fail with an invalid bind password -(expect - :ERROR - (ldap.test/with-ldap-server - (:status (ldap/test-ldap-connection (assoc (get-ldap-details) :password "wrong"))))) - -;; Make sure the basic connection stuff works, this will throw otherwise -(expect - nil - (ldap.test/with-ldap-server - (.close (#'ldap/get-connection)))) - -;; Login with everything right should succeed -(expect - (ldap.test/with-ldap-server - (ldap/verify-password "cn=Directory Manager" "password"))) - -;; Login with wrong password should fail -(expect - false - (ldap.test/with-ldap-server - (ldap/verify-password "cn=Directory Manager" "wrongpassword"))) - -;; Login with invalid DN should fail -(expect - false - (ldap.test/with-ldap-server - (ldap/verify-password "cn=Nobody,ou=nowhere,dc=metabase,dc=com" "password"))) - -;; Login for regular users should also work -(expect - (ldap.test/with-ldap-server - (ldap/verify-password "cn=Sally Brown,ou=People,dc=metabase,dc=com" "1234"))) - -;; Login for regular users should also fail for the wrong password -(expect - false - (ldap.test/with-ldap-server - (ldap/verify-password "cn=Sally 
Brown,ou=People,dc=metabase,dc=com" "password")))
-
-;; Find by username should work (given the default LDAP filter and test fixtures)
-(expect
-  {:dn         "cn=John Smith,ou=People,dc=metabase,dc=com"
-   :first-name "John"
-   :last-name  "Smith"
-   :email      "John.Smith@metabase.com"
-   :groups     ["cn=Accounting,ou=Groups,dc=metabase,dc=com"]}
-  (ldap.test/with-ldap-server
-    (ldap/find-user "jsmith1")))
-
-;; Find by email should also work (also given our default settings and fixtures)
-(expect
-  {:dn         "cn=John Smith,ou=People,dc=metabase,dc=com"
-   :first-name "John"
-   :last-name  "Smith"
-   :email      "John.Smith@metabase.com"
-   :groups     ["cn=Accounting,ou=Groups,dc=metabase,dc=com"]}
-  (ldap.test/with-ldap-server
-    (ldap/find-user "John.Smith@metabase.com")))
-
-;; Find by email should also work (also given our default settings and fixtures)
-(expect
-  {:dn         "cn=Fred Taylor,ou=People,dc=metabase,dc=com"
-   :first-name "Fred"
-   :last-name  "Taylor"
-   :email      "fred.taylor@metabase.com"
-   :groups     []}
-  (ldap.test/with-ldap-server
-    (ldap/find-user "fred.taylor@metabase.com")))
-
-;; LDAP group matching should identify Metabase groups using DN equality rules
-(expect
-  #{1 2 3}
-  (tu/with-temporary-setting-values [ldap-group-mappings {"cn=accounting,ou=groups,dc=metabase,dc=com" [1 2]
-                                                          "cn=shipping,ou=groups,dc=metabase,dc=com"   [2 3]}]
-    (#'ldap/ldap-groups->mb-group-ids ["CN=Accounting,OU=Groups,DC=metabase,DC=com"
-                                       "CN=Shipping,OU=Groups,DC=metabase,DC=com"])))
+(deftest connection-test
+  (testing "successfully connect to IPv4 host"
+    (is (= {:status :SUCCESS}
+           (ldap.test/with-ldap-server
+             (ldap/test-ldap-connection (get-ldap-details))))))
+
+  (testing "anonymous binds are allowed"
+    (is (= {:status :SUCCESS}
+           (ldap.test/with-ldap-server
+             (ldap/test-ldap-connection (dissoc (get-ldap-details) :bind-dn))))))
+
+  (testing "invalid user search base"
+    (is (= :ERROR
+           (ldap.test/with-ldap-server
+             (:status (ldap/test-ldap-connection (assoc (get-ldap-details)
+                                                        :user-base "dc=example,dc=com")))))))
+
+  (testing "invalid group search base"
+    (is (= :ERROR
+           (ldap.test/with-ldap-server
+             (:status (ldap/test-ldap-connection (assoc (get-ldap-details) :group-base "dc=example,dc=com")))))))
+
+  (testing "invalid bind DN"
+    (is (= :ERROR
+           (ldap.test/with-ldap-server
+             (:status (ldap/test-ldap-connection (assoc (get-ldap-details) :bind-dn "cn=Not Directory Manager")))))))
+
+  (testing "invalid bind password"
+    (is (= :ERROR
+           (ldap.test/with-ldap-server
+             (:status (ldap/test-ldap-connection (assoc (get-ldap-details) :password "wrong")))))))
+
+  (testing "basic get-connection works, will throw otherwise"
+    (is (= nil
+           (ldap.test/with-ldap-server
+             (.close (#'ldap/get-connection))))))
+
+  (testing "login should succeed"
+    (is (= true
+           (ldap.test/with-ldap-server
+             (ldap/verify-password "cn=Directory Manager" "password")))))
+
+  (testing "wrong password"
+    (is (= false
+           (ldap.test/with-ldap-server
+             (ldap/verify-password "cn=Directory Manager" "wrongpassword")))))
+
+  (testing "invalid DN fails"
+    (is (= false
+           (ldap.test/with-ldap-server
+             (ldap/verify-password "cn=Nobody,ou=nowhere,dc=metabase,dc=com" "password")))))
+
+  (testing "regular user login"
+    (is (= true
+           (ldap.test/with-ldap-server
+             (ldap/verify-password "cn=Sally Brown,ou=People,dc=metabase,dc=com" "1234")))))
+
+  (testing "fail regular user login with bad password"
+    (is (= false
+           (ldap.test/with-ldap-server
+             (ldap/verify-password "cn=Sally Brown,ou=People,dc=metabase,dc=com" "password")))))
+
+  (testing "find by username"
+    (is (= {:dn         "cn=John Smith,ou=People,dc=metabase,dc=com"
+            :first-name "John"
+            :last-name  "Smith"
+            :email      "John.Smith@metabase.com"
+            :groups 
["cn=Accounting,ou=Groups,dc=metabase,dc=com"]} + (ldap.test/with-ldap-server + (ldap/find-user "jsmith1"))))) + + (testing "find by email" + (is (= {:dn "cn=John Smith,ou=People,dc=metabase,dc=com" + :first-name "John" + :last-name "Smith" + :email "John.Smith@metabase.com" + :groups ["cn=Accounting,ou=Groups,dc=metabase,dc=com"]} + (ldap.test/with-ldap-server + (ldap/find-user "John.Smith@metabase.com"))))) + + (testing "find by email, no groups" + (is (= {:dn "cn=Fred Taylor,ou=People,dc=metabase,dc=com" + :first-name "Fred" + :last-name "Taylor" + :email "fred.taylor@metabase.com" + :groups []} + (ldap.test/with-ldap-server + (ldap/find-user "fred.taylor@metabase.com"))))) + + (testing "LDAP group matching should identify Metabase groups using DN equality rules" + (is (= #{1 2 3} + (tu/with-temporary-setting-values + [ldap-group-mappings {"cn=accounting,ou=groups,dc=metabase,dc=com" [1 2] + "cn=shipping,ou=groups,dc=metabase,dc=com" [2 3]}] + (#'ldap/ldap-groups->mb-group-ids ["CN=Accounting,OU=Groups,DC=metabase,DC=com" + "CN=Shipping,OU=Groups,DC=metabase,DC=com"])))))) + +;; For hosts that do not support IPv6, the connection code will return an error +;; This isn't a failure of the code, it's a failure of the host. +(deftest ipv6-test + (testing "successfully connect to IPv6 host" + (let [actual (ldap.test/with-ldap-server + (ldap/test-ldap-connection (assoc (get-ldap-details) + :host "[::1]")))] + (if (= (:status actual) :ERROR) + (is (re-matches #"An error occurred while attempting to connect to server \[::1].*" (:message actual))) + (is (= {:status :SUCCESS} actual)))))) diff --git a/test/metabase/integrations/slack_test.clj b/test/metabase/integrations/slack_test.clj index c66e12348216e410c37f2695dd302a7bc5d8ec17..48c8f78636bb6e04d220a8ae80043b0ca9e73e10 100644 --- a/test/metabase/integrations/slack_test.clj +++ b/test/metabase/integrations/slack_test.clj @@ -49,7 +49,7 @@ (testing "should return nil if no Slack token has been configured" (tu/with-temporary-setting-values [slack-token nil] (is (= nil - (thunk))))))) + (not-empty (thunk)))))))) (defn- test-invalid-auth-token "Test that a Slack API endpoint function throws an Exception if an invalid Slack API token is set." 
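A quick aside on the `not-empty` guard added to the Slack test above: the helpers under test may yield either `nil` or an empty collection when no token is configured, and `clojure.core/not-empty` collapses both cases to `nil`, so the `(= nil ...)` assertion holds either way. A minimal REPL sketch of that behavior (the inputs here are illustrative, not from the test suite):

    ;; not-empty returns nil for an empty collection, otherwise the collection itself
    (not-empty [])      ;; => nil
    (not-empty nil)     ;; => nil
    (not-empty [1 2])   ;; => [1 2]

This is why the test can assert a single expected value regardless of which "empty" representation the API helper returns.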
diff --git a/test/metabase/models/user_test.clj b/test/metabase/models/user_test.clj index 8af112e085fd0ce4fa8248c78d255b2985cb248f..9d640f09188e4faa1cdbc386f12113ce0a812952 100644 --- a/test/metabase/models/user_test.clj +++ b/test/metabase/models/user_test.clj @@ -4,17 +4,18 @@ [string :as str] [test :refer :all]] [metabase - [email-test :as email-test] [http-client :as http] [test :as mt] [util :as u]] [metabase.models [collection :as collection :refer [Collection]] [collection-test :as collection-test] + [database :refer [Database]] [permissions :as perms] [permissions-group :as group :refer [PermissionsGroup]] [permissions-group-membership :refer [PermissionsGroupMembership]] [session :refer [Session]] + [table :refer [Table]] [user :as user :refer [User]]] [metabase.test.data.users :as test-users] [metabase.util.password :as u.password] @@ -65,13 +66,25 @@ remove-non-collection-perms (collection-test/perms-path-ids->names [child-collection grandchild-collection]))))))))) +(deftest group-data-permissions-test + (testing "If a User is a member of a Group with data permissions for an object, `permissions-set` should return the perms" + (mt/with-temp* [Database [{db-id :id}] + Table [table {:name "Round Table", :db_id db-id}] + PermissionsGroup [{group-id :id}] + PermissionsGroupMembership [_ {:group_id group-id, :user_id (mt/user->id :rasta)}]] + (perms/revoke-permissions! (group/all-users) db-id (:schema table) (:id table)) + (perms/grant-permissions! group-id (perms/table-read-path table)) + (is (set/subset? + #{(perms/table-read-path table)} + (metabase.models.user/permissions-set (mt/user->id :rasta))))))) + ;;; Tests for invite-user and create-new-google-auth-user! (defn- maybe-accept-invite! "Accept an invite if applicable. Look in the body of the content of the invite email for the reset token since this is the only place to get it (the token stored in the DB is an encrypted hash)." [new-user-email-address] - (when-let [[{[{invite-email :content}] :body}] (get @email-test/inbox new-user-email-address)] + (when-let [[{[{invite-email :content}] :body}] (get @mt/inbox new-user-email-address)] (let [[_ reset-token] (re-find #"/auth/reset_password/(\d+_[\w_-]+)#new" invite-email)] (http/client :post 200 "session/reset_password" {:token reset-token :password "ABC123"})))) @@ -80,7 +93,7 @@ "Fetch the emails that have been sent in the form of a map of email address -> sequence of email subjects. For test-writing convenience the random email and names assigned to the new user are replaced with `<New User>`." [new-user-email-address new-user-first-name new-user-last-name] - (into {} (for [[address emails] @email-test/inbox + (into {} (for [[address emails] @mt/inbox :let [address (if (= address new-user-email-address) "<New User>" address)]] @@ -94,7 +107,7 @@ [& {:keys [google-auth? accept-invite? password invitor] :or {accept-invite? true}}] (mt/with-temporary-setting-values [site-name "Metabase"] - (email-test/with-fake-inbox + (mt/with-fake-inbox (let [new-user-email (mt/random-email) new-user-first-name (mt/random-name) new-user-last-name (mt/random-name) diff --git a/test/metabase/public_settings_test.clj b/test/metabase/public_settings_test.clj index d7b65ea7881d3832e5a2ba2bd86962fea8dd22a7..4a077bfc21c4515a776e9a8ed194ad22a684e14a 100644 --- a/test/metabase/public_settings_test.clj +++ b/test/metabase/public_settings_test.clj @@ -41,7 +41,7 @@ (testing "we should not be allowed to set an invalid `site-url` (#9850)" (mt/discard-setting-changes [site-url] (is (thrown? 
- AssertionError + clojure.lang.ExceptionInfo (public-settings/site-url "http://https://www.camsaul.com")))))) (deftest site-url-settings-set-valid-domain-name diff --git a/test/metabase/query_processor/middleware/add_dimension_projections_test.clj b/test/metabase/query_processor/middleware/add_dimension_projections_test.clj index 05827bbd94136ea04cedfeb01e0249b21ac326e4..dfb89cc90001294c08c1457c37a0c3a540744afd 100644 --- a/test/metabase/query_processor/middleware/add_dimension_projections_test.clj +++ b/test/metabase/query_processor/middleware/add_dimension_projections_test.clj @@ -1,5 +1,6 @@ (ns metabase.query-processor.middleware.add-dimension-projections-test (:require [clojure.test :refer :all] + [medley.core :as m] [metabase.query-processor.middleware.add-dimension-projections :as add-dim-projections] [metabase.test :as mt] [metabase.test.fixtures :as fixtures] @@ -10,44 +11,83 @@ ;;; ----------------------------------------- add-fk-remaps (pre-processing) ----------------------------------------- (def ^:private example-query - {:database 1 + {:database (mt/id) :type :query - :query {:source-table 1 - :fields [[:field-id 1] - [:field-id 2] - [:field-id 3]]}}) + :query {:source-table (mt/id :venues) + :fields [[:field-id (mt/id :venues :price)] + [:field-id (mt/id :venues :longitude)] + [:field-id (mt/id :venues :category_id)]]}}) + +(def ^:private remapped-field + {:name "Product" + :field_id (mt/id :venues :category_id) + :human_readable_field_id (mt/id :categories :name) + :field_name "CATEGORY_ID" + :human_readable_field_name "NAME"}) (defn- do-with-fake-remappings-for-field-3 [f] (with-redefs [add-dim-projections/fields->field-id->remapping-dimension (constantly - {3 {:name "Product", :field_id 3, :human_readable_field_id 4}})] + {(mt/id :venues :category_id) {:name "Product" + :field_id (mt/id :venues :category_id) + :human_readable_field_id (mt/id :categories :name)}})] (f))) (deftest create-remap-col-tuples (testing "make sure we create the remap column tuples correctly" (do-with-fake-remappings-for-field-3 (fn [] - (is (= [[[:field-id 3] - [:fk-> [:field-id 3] [:field-id 4]] - {:name "Product", :field_id 3, :human_readable_field_id 4}]] - (#'add-dim-projections/create-remap-col-tuples [[:field-id 1] [:field-id 2] [:field-id 3]]))))))) + (is (= [[[:field-id (mt/id :venues :category_id)] + [:fk-> [:field-id (mt/id :venues :category_id)] [:field-id (mt/id :categories :name)]] + remapped-field]] + (#'add-dim-projections/create-remap-col-tuples [[:field-id (mt/id :venues :price)] + [:field-id (mt/id :venues :longitude)] + [:field-id (mt/id :venues :category_id)]]))))))) (deftest add-fk-remaps-test (do-with-fake-remappings-for-field-3 (fn [] (testing "make sure FK remaps add an entry for the FK field to `:fields`, and returns a pair of [dimension-info updated-query]" - (is (= [[{:name "Product", :field_id 3, :human_readable_field_id 4}] + (is (= [[remapped-field] (update-in example-query [:query :fields] - conj [:fk-> [:field-id 3] [:field-id 4]])] + conj [:fk-> [:field-id (mt/id :venues :category_id)] [:field-id (mt/id :categories :name)]])] (#'add-dim-projections/add-fk-remaps example-query)))) (testing "adding FK remaps should replace any existing order-bys for a field with order bys for the FK remapping Field" - (is (= [[{:name "Product", :field_id 3, :human_readable_field_id 4}] + (is (= [[remapped-field] (-> example-query - (assoc-in [:query :order-by] [[:asc [:fk-> [:field-id 3] [:field-id 4]]]]) + (assoc-in [:query :order-by] + [[:asc [:fk-> [:field-id (mt/id :venues 
:category_id)] + [:field-id (mt/id :categories :name)]]]]) (update-in [:query :fields] - conj [:fk-> [:field-id 3] [:field-id 4]]))] - (#'add-dim-projections/add-fk-remaps (assoc-in example-query [:query :order-by] [[:asc [:field-id 3]]])))))))) + conj [:fk-> [:field-id (mt/id :venues :category_id)] + [:field-id (mt/id :categories :name)]]))] + (-> example-query + (assoc-in [:query :order-by] [[:asc [:field-id (mt/id :venues :category_id)]]]) + (#'add-dim-projections/add-fk-remaps))))) + + (testing "adding FK remaps should replace any existing breakouts for a field with order bys for the FK remapping Field" + (is (= [[remapped-field] + (-> example-query + (assoc-in [:query :aggregation] [[:count]]) + (assoc-in [:query :breakout] + [[:fk-> [:field-id (mt/id :venues :category_id)] + [:field-id (mt/id :categories :name)]] + [:field-id (mt/id :venues :category_id)]]) + (m/dissoc-in [:query :fields]))] + (-> example-query + (m/dissoc-in [:query :fields]) + (assoc-in [:query :aggregation] [[:count]]) + (assoc-in [:query :breakout] [[:field-id (mt/id :venues :category_id)]]) + (#'add-dim-projections/add-fk-remaps))))) + + (testing "make sure FK remaps work with nested queries" + (let [example-query (assoc example-query :query {:source-query (:query example-query)})] + (is (= [[remapped-field] + (update-in example-query [:query :source-query :fields] + conj [:fk-> [:field-id (mt/id :venues :category_id)] + [:field-id (mt/id :categories :name)]])] + (#'add-dim-projections/add-fk-remaps example-query)))))))) ;;; ---------------------------------------- remap-results (post-processing) ----------------------------------------- diff --git a/test/metabase/query_processor/middleware/large_int_id_test.clj b/test/metabase/query_processor/middleware/large_int_id_test.clj new file mode 100644 index 0000000000000000000000000000000000000000..671c6d80e05f1f333fd7a3868bbef295a010ccee --- /dev/null +++ b/test/metabase/query_processor/middleware/large_int_id_test.clj @@ -0,0 +1,97 @@ +(ns metabase.query-processor.middleware.large-int-id-test + (:require [clojure.test :refer :all] + [metabase + [query-processor :as qp] + [test :as mt]])) + +(deftest convert-ids + (let [query (mt/mbql-query users + {:order-by [[:asc $id]] + :limit 5})] + (testing "PKs become strings when middleware enabled" + (is (= [["1" "Plato Yeshua" "2014-04-01T08:30:00Z"] + ["2" "Felipinho Asklepios" "2014-12-05T15:15:00Z"] + ["3" "Kaneonuskatew Eiran" "2014-11-06T16:15:00Z"] + ["4" "Simcha Yan" "2014-01-01T08:30:00Z"] + ["5" "Quentin Sören" "2014-10-03T17:30:00Z"]] + (mt/rows + (qp/process-query (assoc query :middleware {:js-int-to-string? true})))))) + + (testing "PKs are left alone when middleware disabled (default)" + (is (= [[1 "Plato Yeshua" "2014-04-01T08:30:00Z"] + [2 "Felipinho Asklepios" "2014-12-05T15:15:00Z"] + [3 "Kaneonuskatew Eiran" "2014-11-06T16:15:00Z"] + [4 "Simcha Yan" "2014-01-01T08:30:00Z"] + [5 "Quentin Sören" "2014-10-03T17:30:00Z"]] + (mt/rows + (qp/process-query (assoc query :middleware {}))))))) + + (let [query (mt/mbql-query users + {:fields [$name] + :order-by [[:asc $name]] + :limit 5})] + (testing "handle when there are no ID columns in the query but the middleware is enabled" + (is (= [["Broen Olujimi"] + ["Conchúr Tihomir"] + ["Dwight Gresham"] + ["Felipinho Asklepios"] + ["Frans Hevel"]] + (mt/rows + (qp/process-query (assoc query :middleware {:js-int-to-string? 
true}))))))) + + (let [query (mt/mbql-query venues + {:order-by [[:asc $id]] + :limit 5})] + (testing "FKs become strings when middleware enabled" + (is (= [["1" "Red Medicine" "4" 10.0646 -165.374 3] + ["2" "Stout Burgers & Beers" "11" 34.0996 -118.329 2] + ["3" "The Apple Pan" "11" 34.0406 -118.428 2] + ["4" "Wurstküche" "29" 33.9997 -118.465 2] + ["5" "Brite Spot Family Restaurant" "20" 34.0778 -118.261 2]] + (mt/rows + (qp/process-query (assoc query :middleware {:js-int-to-string? true})))))) + + (testing "FKs are left alone when middleware disabled (default)" + (is (= [[1 "Red Medicine" 4 10.0646 -165.374 3] + [2 "Stout Burgers & Beers" 11 34.0996 -118.329 2] + [3 "The Apple Pan" 11 34.0406 -118.428 2] + [4 "Wurstküche" 29 33.9997 -118.465 2] + [5 "Brite Spot Family Restaurant" 20 34.0778 -118.261 2]] + (mt/rows + (qp/process-query (assoc query :middleware {}))))))) + + (let [query (mt/mbql-query checkins + {:fields [$id $user_id->users.id $user_id->users.name $venue_id->venues.id $venue_id->venues.name] + :order-by [[:asc $id]] + :limit 5})] + (testing "joins work correctly" + (is (= [["1" "5" "Quentin Sören" "12" "The Misfit Restaurant + Bar"] + ["2" "1" "Plato Yeshua" "31" "Bludso's BBQ"] + ["3" "8" "Szymon Theutrich" "56" "Philippe the Original"] + ["4" "5" "Quentin Sören" "4" "Wurstküche"] + ["5" "3" "Kaneonuskatew Eiran" "49" "Hotel Biron"]] + (mt/rows + (qp/process-query (assoc query :middleware {:js-int-to-string? true}))))))) + + (let [query (mt/mbql-query venues + {:source-query {:source-table $$venues + :aggregation [[:aggregation-options + [:avg $id] + {:name "some_generated_name", :display-name "My Cool Ag"}]] + :breakout [$price]}})] + ;; see comment in `metabase.query-processor.middleware.large-int-id/convert-id-to-string` + ;; for why this value does not change + (testing "aggregations are not converted to strings with middleware enabled" + (is (= [[1 55] + [2 48] + [3 47] + [4 62]] + (mt/rows + (qp/process-query (assoc query :middleware {:js-int-to-string? true}))))) ) + (testing "aggregation does not convert to strings with middleware disabled (default)" + (is (= [[1 55] + [2 48] + [3 47] + [4 62]] + (mt/rows + (qp/process-query (assoc query :middleware {})))))))) diff --git a/test/metabase/query_processor_test/remapping_test.clj b/test/metabase/query_processor_test/remapping_test.clj index b21584e2e1b7a88337549c4912486177978c7988..c1c7fc81f58a0490aa8486210bdb3249a1b7daba 100644 --- a/test/metabase/query_processor_test/remapping_test.clj +++ b/test/metabase/query_processor_test/remapping_test.clj @@ -28,7 +28,62 @@ (mt/run-mbql-query venues {:fields [$name $category_id] :order-by [[:asc $name]] - :limit 4})))))))) + :limit 4}))))))) + (mt/test-drivers (mt/normal-drivers-with-feature :foreign-keys) + (mt/with-temp-objects + (data/create-venue-category-fk-remapping! 
"Name") + (is (= {:rows [["American" 2 8] + ["Artisan" 3 2] + ["Asian" 4 2]] + :cols [(-> (qp.test/col :categories :name) + (assoc :remapped_from (mt/format-name "category_id")) + (assoc :field_ref [:fk-> [:field-id (mt/id :venues :category_id)] + [:field-id (mt/id :categories :name)]]) + (assoc :fk_field_id (mt/id :venues :category_id)) + (assoc :source :breakout)) + (-> (qp.test/col :venues :category_id) + (assoc :remapped_to (mt/format-name "name")) + (assoc :source :breakout)) + {:field_ref [:aggregation 0] + :source :aggregation + :display_name "Count" + :name "count" + :special_type :type/Number}]} + (-> (qp.test/format-rows-by [str int int] + (mt/run-mbql-query venues + {:aggregation [[:count]] + :breakout [$category_id] + :limit 3})) + qp.test/rows-and-cols + (update :cols (fn [[c1 c2 agg]] + [c1 c2 (dissoc agg :base_type)])))))))) + +(deftest nested-remapping-test + (mt/test-drivers (mt/normal-drivers-with-feature :nested-queries) + (mt/with-temp-objects + (data/create-venue-category-remapping! "Foo") + (is (= {:rows [["20th Century Cafe" 12 "Café"] + ["25°" 11 "Burger"] + ["33 Taps" 7 "Bar"] + ["800 Degrees Neapolitan Pizzeria" 58 "Pizza"]] + :cols [(-> (qp.test/col :venues :name) + (assoc :field_ref [:field-literal (mt/format-name "name") :type/Text]) + (dissoc :description :parent_id :visibility_type)) + (-> (qp.test/col :venues :category_id) + (assoc :remapped_to "Foo") + (assoc :field_ref [:field-literal (mt/format-name"category_id")]) + (dissoc :description :parent_id :visibility_type)) + (#'add-dimension-projections/create-remapped-col "Foo" (mt/format-name "category_id") :type/Text)]} + (-> (qp.test/format-rows-by [str int str] + (mt/run-mbql-query venues + {:source-query {:source-table (mt/id :venues) + :fields [[:field-id (mt/id :venues :name)] + [:field-id (mt/id :venues :category_id)]] + :order-by [[:asc [:field-id (mt/id :venues :name)]]] + :limit 4}})) + qp.test/rows-and-cols + (update :cols (fn [[c1 c2 c3]] + [c1 (update c2 :field_ref (comp vec butlast)) c3])))))))) (defn- select-columns "Focuses the given resultset to columns that return true when passed to `columns-pred`. 
Typically this would be done @@ -93,7 +148,7 @@ :order-by [[:asc $name]] :limit 4}))))) -;; Test that we can remap inside an MBQL nested query +;; Test that we can remap inside an MBQL query (datasets/expect-with-drivers (mt/normal-drivers-with-feature :foreign-keys :nested-queries) ["Kinaree Thai Bistro" "Ruen Pair Thai Restaurant" "Yamashiro Hollywood" "Spitz Eagle Rock" "The Gumbo Pot"] (mt/with-temp-objects diff --git a/test/metabase/task/sync_databases_test.clj b/test/metabase/task/sync_databases_test.clj index 27c5c3b14ce08367a050cdff1ce37cd8b1add416..024409ead39b87a0090fea036939c21892141aa5 100644 --- a/test/metabase/task/sync_databases_test.clj +++ b/test/metabase/task/sync_databases_test.clj @@ -5,7 +5,6 @@ (:require [clojure [string :as str] [test :refer :all]] - [expectations :refer [expect]] [metabase.models.database :refer [Database]] [metabase.task.sync-databases :as sync-db] [metabase.test.util :as tu] @@ -64,62 +63,59 @@ :data {"db-id" "<id>"}}]}) ;; Check that a newly created database automatically gets scheduled -(expect - [sync-job fv-job] - (with-scheduler-setup - (tt/with-temp Database [database {:engine :postgres}] - (current-tasks-for-db database)))) - +(deftest new-db-jobs-scheduled-test + (is (= [sync-job fv-job] + (with-scheduler-setup + (tt/with-temp Database [database {:engine :postgres}] + (current-tasks-for-db database)))))) ;; Check that a custom schedule is respected when creating a new Database -(expect - [(assoc-in sync-job [:triggers 0 :cron-schedule] "0 30 4,16 * * ? *") - (assoc-in fv-job [:triggers 0 :cron-schedule] "0 15 10 ? * 6#3")] - (with-scheduler-setup - (tt/with-temp Database [database {:engine :postgres - :metadata_sync_schedule "0 30 4,16 * * ? *" ; 4:30 AM and PM daily - :cache_field_values_schedule "0 15 10 ? * 6#3"}] ; 10:15 on the 3rd Friday of the Month - (current-tasks-for-db database)))) - +(deftest custom-schedule-test + (is (= [(assoc-in sync-job [:triggers 0 :cron-schedule] "0 30 4,16 * * ? *") + (assoc-in fv-job [:triggers 0 :cron-schedule] "0 15 10 ? * 6#3")] + (with-scheduler-setup + (tt/with-temp Database [database {:engine :postgres + :metadata_sync_schedule "0 30 4,16 * * ? *" ; 4:30 AM and PM daily + :cache_field_values_schedule "0 15 10 ? * 6#3"}] ; 10:15 on the 3rd Friday of the Month + (current-tasks-for-db database)))))) ;; Check that a deleted database gets unscheduled -(expect - [(update sync-job :triggers empty) - (update fv-job :triggers empty)] - (with-scheduler-setup - (tt/with-temp Database [database {:engine :postgres}] - (db/delete! Database :id (u/get-id database)) - (current-tasks-for-db database)))) +(deftest unschedule-deleted-database-test + (is (= [(update sync-job :triggers empty) + (update fv-job :triggers empty)] + (with-scheduler-setup + (tt/with-temp Database [database {:engine :postgres}] + (db/delete! Database :id (u/get-id database)) + (current-tasks-for-db database)))))) ;; Check that changing the schedule column(s) for a DB properly updates the scheduled tasks -(expect - [(assoc-in sync-job [:triggers 0 :cron-schedule] "0 15 10 ? * MON-FRI") - (assoc-in fv-job [:triggers 0 :cron-schedule] "0 11 11 11 11 ?")] - (with-scheduler-setup - (tt/with-temp Database [database {:engine :postgres}] - (db/update! Database (u/get-id database) - :metadata_sync_schedule "0 15 10 ? 
* MON-FRI" ; 10:15 AM every weekday - :cache_field_values_schedule "0 11 11 11 11 ?") ; Every November 11th at 11:11 AM - (current-tasks-for-db database)))) +(deftest schedule-change-test + (is (= [(assoc-in sync-job [:triggers 0 :cron-schedule] "0 15 10 ? * MON-FRI") + (assoc-in fv-job [:triggers 0 :cron-schedule] "0 11 11 11 11 ?")] + (with-scheduler-setup + (tt/with-temp Database [database {:engine :postgres}] + (db/update! Database (u/get-id database) + :metadata_sync_schedule "0 15 10 ? * MON-FRI" ; 10:15 AM every weekday + :cache_field_values_schedule "0 11 11 11 11 ?") ; Every November 11th at 11:11 AM + (current-tasks-for-db database)))))) ;; Check that changing one schedule doesn't affect the other -(expect - [sync-job - (assoc-in fv-job [:triggers 0 :cron-schedule] "0 15 10 ? * MON-FRI")] - (with-scheduler-setup - (tt/with-temp Database [database {:engine :postgres}] - (db/update! Database (u/get-id database) - :cache_field_values_schedule "0 15 10 ? * MON-FRI") - (current-tasks-for-db database)))) - -(expect - [(assoc-in sync-job [:triggers 0 :cron-schedule] "0 15 10 ? * MON-FRI") - fv-job] - (with-scheduler-setup - (tt/with-temp Database [database {:engine :postgres}] - (db/update! Database (u/get-id database) - :metadata_sync_schedule "0 15 10 ? * MON-FRI") - (current-tasks-for-db database)))) +(deftest schedule-changes-only-expected-test + (is (= [sync-job + (assoc-in fv-job [:triggers 0 :cron-schedule] "0 15 10 ? * MON-FRI")] + (with-scheduler-setup + (tt/with-temp Database [database {:engine :postgres}] + (db/update! Database (u/get-id database) + :cache_field_values_schedule "0 15 10 ? * MON-FRI") + (current-tasks-for-db database))))) + + (is (= [(assoc-in sync-job [:triggers 0 :cron-schedule] "0 15 10 ? * MON-FRI") + fv-job] + (with-scheduler-setup + (tt/with-temp Database [database {:engine :postgres}] + (db/update! Database (u/get-id database) + :metadata_sync_schedule "0 15 10 ? * MON-FRI") + (current-tasks-for-db database)))))) (deftest validate-schedules-test (testing "Check that you can't INSERT a DB with an invalid schedule" @@ -138,6 +134,32 @@ (db/update! Database (u/get-id database) k "2 CANS PER DAY")))))))) +(defrecord MockJobExecutionContext [job-data-map] + org.quartz.JobExecutionContext + (getMergedJobDataMap [this] (org.quartz.JobDataMap. job-data-map)) + + clojurewerkz.quartzite.conversion/JobDataMapConversion + (from-job-data [this] + (.getMergedJobDataMap this))) + +(deftest check-orphaned-jobs-removed-test + (testing "jobs for orphaned databases are removed during sync run" + (with-scheduler-setup + (doseq [sync-fn [sync-db/update-field-values sync-db/sync-and-analyze-database]] + (testing (str sync-fn) + (tt/with-temp Database [database {:engine :postgres}] + (let [db-id (:id database)] + (is (= [sync-job fv-job] + (current-tasks-for-db database))) + + (db/delete! Database :id db-id) + (let [ctx (MockJobExecutionContext. {"db-id" db-id})] + (sync-fn ctx)) + + (is (= [(update sync-job :triggers empty) + (update fv-job :triggers empty)] + (current-tasks-for-db database)))))))))) + ;;; +----------------------------------------------------------------------------------------------------------------+ ;;; | CHECKING THAT SYNC TASKS RUN CORRECT FNS | ;;; +----------------------------------------------------------------------------------------------------------------+