Best Practices
See Connecting to a Data Source using an SSH Tunnel for more information about connecting to data sources through an SSH tunnel.
Having reliable data sources with strong data governance improves the experience for both end users and admins. We recommend creating a dedicated Matik user profile rather than connecting with an existing user's credentials. This prevents you from having to re-enter credentials when personnel changes occur and also limits table ACLs to Matik-specific content.
Presto
Redshift
Athena
You may also access Athena using role-based authentication. In this case, you don't need to create another user in your AWS account; instead, you create a dedicated role that Matik can assume in order to access the data. The step-by-step instructions are as follows:
1. Navigate to the "Data Sources" tab in Matik.
2. Click the "Add Data Source" button.
3. Select "Athena" from the Data Source Type dropdown.
4. Navigate to the bottom of the data source form and click "Access via role".
5. Copy Matik's account ID.
6. In the AWS console, navigate to the IAM service and click the "Create Role" button.
7. Under the trusted entity type selection, select AWS Account.
8. Under the "An AWS Account" heading, select "Another AWS Account" and paste the copied Matik account ID into the field.
9. Check the "Require external ID" checkbox under the options section.
10. Navigate to Matik, copy the external ID from the data source form, and enter it into the external ID field in AWS.
11. Add permission policies to the role to give it access to the right resources in your AWS account. Matik will need access to Athena as well as to the S3 bucket used for query results (see the example policies after this list).
12. Give the role a name and optional description and create it.
13. Navigate to the role that you created in step 12 and copy its ARN.
14. Navigate back to Matik and paste the role ARN into the field.
15. Finish filling out the data source form in Matik with the rest of the information for the Athena connection and click "Test Connection" to ensure everything is working.
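As a rough illustration of steps 7 through 11 (the account ID, external ID, and bucket name below are placeholders, not real values), the role's trust policy and a minimal permissions policy might look something like the following. Adjust the actions and resources to match your own Athena setup and the S3 bucket that holds your query results.

Trust policy (allows Matik's account to assume the role, scoped by the external ID):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::<MATIK_ACCOUNT_ID>:root" },
      "Action": "sts:AssumeRole",
      "Condition": { "StringEquals": { "sts:ExternalId": "<EXTERNAL_ID_FROM_MATIK>" } }
    }
  ]
}

Permissions policy (grants query access to Athena and to the S3 bucket used for query results):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["athena:StartQueryExecution", "athena:GetQueryExecution", "athena:GetQueryResults", "athena:ListDatabases", "athena:ListTableMetadata"],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": ["arn:aws:s3:::<QUERY_RESULTS_BUCKET>", "arn:aws:s3:::<QUERY_RESULTS_BUCKET>/*"]
    }
  ]
}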
Google BigQuery
When using Google BigQuery, please ensure the IAM user you are using for your Matik connection has the following BigQuery roles: "BigQuery Data Viewer" and "BigQuery Metadata Viewer".
You may use either OAuth2 or "Service Account Info" as your Google authentication method. If using your Google Cloud Service Account info, you will need to enter a Service Account JSON when adding your data source.
Your "Service Account JSON" should be in the following format:
PostgreSQL
MySQL
Snowflake
Snowflake can be accessed via Login (username and password), RSA Key Pair, or OAuth2.
Login:
RSA Key Pair:
Please enter your Login Name in the Snowflake username field. Note that the Login Name can be different from a user's "Name" in Snowflake. You can run DESC USER <User Name> from the Snowflake console to view your Login Name.
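For example, assuming a user named JANE_DOE (a placeholder), running the following in a Snowflake worksheet returns a list of properties; the value of the LOGIN_NAME property is what belongs in Matik's Snowflake username field:

DESC USER JANE_DOE;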
When entering your Private Key, be sure to include the header and footer text included in the generated RSA key file (e.g. -----BEGIN ENCRYPTED PRIVATE KEY----- and -----END ENCRYPTED PRIVATE KEY-----):
-----BEGIN ENCRYPTED PRIVATE KEY-----
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
...
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
-----END ENCRYPTED PRIVATE KEY-----
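If you still need to generate a key pair, a typical sequence (following Snowflake's key-pair authentication documentation; file names and the user name below are placeholders) looks like this:

# generate an encrypted private key in PKCS#8 format
openssl genrsa 2048 | openssl pkcs8 -topk8 -v2 aes-256-cbc -inform PEM -out rsa_key.p8
# derive the matching public key
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub

Then assign the public key to your Snowflake user (paste the key contents without the header and footer lines):

ALTER USER <user_name> SET RSA_PUBLIC_KEY='<public_key_value>';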
If you see an error message stating that the JWT token is invalid, check out this Snowflake Help Article.
OAuth2: