Best Practices for Avionté DataLink

Avionté’s DataLink is a powerful data integration service designed for large to enterprise-level organizations with advanced analytical needs. It provides direct access to query your Insights data using SQL, either through the DataLink user interface or through external tools. Using the DataLink Account, you can access a raw data feed that integrates seamlessly with your existing Business Intelligence (BI) tools for generating custom reports, advanced analytics, and complex data visualizations.

The DataLink Account is tailored for organizations that:

  • Require advanced data integration to combine DataLink data with their own or third-party datasets.
  • Need full control over their data for custom reporting or detailed analysis.
  • Have strong internal data teams capable of implementing and managing technical integrations.

By using the DataLink Account, you gain the flexibility to explore, aggregate, and analyze your data through SQL, allowing you to unlock key metrics across themes like gross profit, invoicing, applicant statuses, and employee activities.

Best Practices for Using the DataLink Account

Before setting up your connection, it’s important to understand key best practices to ensure efficient, secure, and scalable data usage. The following guidelines will help you optimize performance, manage costs, and structure your data workflows effectively. Once you’re familiar with these principles, follow the steps in the Setting Up Your Avionté DataLink Account section to connect to your data.

Optimizing Data Extraction and Querying

Avionté DataLink offers powerful tools for analyzing your data, but how you extract and manage that data has a direct impact on speed, efficiency, and cost. Effective data management ensures that your queries run smoothly, remain cost-efficient, and scale well over time.

To maximize efficiency, it’s important to structure your data extraction processes correctly. The next section explains how ETL (Extract, Transform, Load) processes can optimize performance and why third-party tools are often the preferred choice.

Understanding ETL for Efficient Data Management

When working with the DataLink Account, an essential part of managing your data efficiently is deciding where and how to process it. This is where ETL comes in.

What is ETL?

ETL (Extract, Transform, Load) is a core process for preparing data for analysis. It involves:

  • Extracting raw data from a source system.
  • Transforming it into a clean and usable format.
  • Loading it into a system for analysis.

By following recommended ETL practices, you ensure that your data is reliable, organized, and ready for use in tools like Power BI or Tableau. Proper ETL processes also keep your Snowflake environment running smoothly by reducing query loads and simplifying analysis workflows.

A key decision in ETL workflows is where to process and store extracted data. While some transformations can be done within the DataLink Account, pulling data into your own warehouse and using third-party tools for processing often provides better performance and cost efficiency.

Why Extract Data to Local Environments or Third-Party Tools?

While the DataLink Account is well suited for analytics, running many frequent or complex queries directly on the platform can lead to higher costs and slower performance. Extracting data into local environments or external tools has several benefits:

  • Cost Savings: Extracting only the necessary data lowers DataLink processing costs and avoids the expense of repeated or complex queries.
  • Faster Performance: Offloading high-frequency or operational queries to a local environment improves speed and responsiveness.
  • More Flexibility: External environments allow you to easily combine DataLink data with other datasets and perform custom transformations tailored to specific needs.
  • Improved Responsiveness: Running analyses locally eliminates network delays, resulting in faster reporting and operational results.
  • Better Data Control: External extraction enables additional layers of access control, data security, and the creation of snapshots for historical tracking or audits (see the snapshot sketch below).
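
For instance, once data is extracted into your own warehouse, a dated snapshot table preserves a point-in-time copy for audits. A minimal sketch, run in your own environment rather than in DataLink, assuming a reporting schema you can write to (the schema and table names are illustrative):

    -- Point-in-time snapshot in your own warehouse (not in DataLink).
    -- REPORTING and the table names are hypothetical; adjust to your environment.
    CREATE TABLE REPORTING.DIMAPPLICANT_SNAPSHOT_20250101 AS
    SELECT *
    FROM REPORTING.DIMAPPLICANT;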

Key ETL Best Practices

Extract Only What You Need

  • Avoid using SELECT * queries; instead, select only the specific columns and rows you require:
    SELECT customer_id, customer_name, email  
    FROM customers  
    WHERE country = 'USA'; 
  • Apply filters (WHERE, LIMIT) to exclude unnecessary data and minimize extraction volumes.

Keep Queries Simple During Extraction

  • Avoid complex joins in the DataLink Account during extraction. Instead, extract tables separately and join them later in tools like Power BI or Tableau, as in the sketch below.
  • If joins are unavoidable, limit data size using appropriate filters.
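
For example, rather than joining applicant activity to applicant details inside DataLink, pull each table down on its own and relate them in your BI tool. A minimal sketch, assuming DIMAPPLICANT and FCTACTIVITY share an applicant key column (called APPLICANTKEY here; confirm the actual name in the Data Dictionary):

    -- Extract each table separately; join later in Power BI or Tableau.
    -- APPLICANTKEY is an assumed shared key; verify it against the Data Dictionary.
    SELECT APPLICANTKEY /* plus the applicant columns you need */
    FROM DIMAPPLICANT;

    SELECT APPLICANTKEY /* plus the activity columns you need */
    FROM FCTACTIVITY;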

Use Incremental Extraction

  • Extract only new or updated rows based on a timestamp column. The example below demonstrates how to retrieve records that have been inserted or updated in the last day:

-- Incremental extraction: pull only rows inserted or updated in the last day
SELECT a.*
FROM DIMAPPLICANT a
WHERE a.DW_INSERTEDDATE >= DATEADD(day, -1, CURRENT_TIMESTAMP())
   OR a.DW_UPDATEDDATE >= DATEADD(day, -1, CURRENT_TIMESTAMP());

This approach ensures that you only pull the most recent changes, reducing the amount of data processed and improving efficiency.

Optimize Data Transfers

  • Enable DataLink’s compression features to reduce the size of data transfers.
  • Use efficient file formats such as CSV or Parquet for large datasets; one way to unload compressed Parquet is sketched below.
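
If your role permits unloading to a stage, Snowflake’s COPY INTO command can write compressed Parquet files directly. A minimal sketch, assuming a stage named @my_unload_stage exists and your DataLink role may use it (both are assumptions; many BI and ETL tools handle this step for you):

    -- Unload to compressed Parquet; @my_unload_stage is a hypothetical stage.
    COPY INTO @my_unload_stage/dimapplicant/
    FROM (
        SELECT *   -- narrow this column list in practice
        FROM DIMAPPLICANT
    )
    FILE_FORMAT = (TYPE = PARQUET COMPRESSION = SNAPPY);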

Test and Monitor Your Queries

  • Run queries on smaller datasets before running them on larger tables.
  • Use DataLink’s Query History or Query Profile to check performance and troubleshoot issues; a sample history query appears below.
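
You can also inspect recent history in SQL through Snowflake’s INFORMATION_SCHEMA.QUERY_HISTORY table function, for example to surface your slowest recent queries:

    -- Show the 25 most recent queries, slowest first.
    SELECT query_text, start_time, total_elapsed_time, bytes_scanned
    FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 25))
    ORDER BY total_elapsed_time DESC;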

Example Workflow

Following a structured workflow can help you extract data efficiently while maintaining performance and cost-effectiveness. Here’s a recommended process:

  1. Identify required data: Start by reviewing the product’s schema or Data Dictionary to determine the tables and columns you need for analysis.
  2. Apply filters to minimize data volume: Use WHERE clauses or other filtering techniques to extract only relevant data.
  3. Leverage incremental loading: Use timestamp columns or Snowflake Streams to pull only new or updated data. For example:
    SELECT a.* 
    FROM DIMAPPLICANT a
    WHERE a.DW_INSERTEDDATE >= DATEADD(day, -1, CURRENT_TIMESTAMP())
    OR a.DW_UPDATEDDATE >= DATEADD(day, -1, CURRENT_TIMESTAMP()) 
  4. Extract data in manageable batches: For large datasets, use pagination or batch limits to process data in smaller chunks, as in the sketch following this list.
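
A minimal batching sketch, assuming DIMAPPLICANT has a stable, unique key column (called APPLICANTKEY here; substitute the actual key from the Data Dictionary). Each batch resumes from the highest key returned by the previous batch:

    -- Keyset pagination: fetch the next batch of 10,000 rows.
    -- APPLICANTKEY is an assumed unique key; replace 0 with the highest
    -- key returned by the previous batch.
    SELECT *
    FROM DIMAPPLICANT
    WHERE APPLICANTKEY > 0
    ORDER BY APPLICANTKEY
    LIMIT 10000;

This keyset approach scales better than LIMIT ... OFFSET because each batch seeks directly past the previous key instead of re-scanning the rows it skipped.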

By following this workflow, you can optimize your data extraction process while keeping DataLink responsive and optimized for its core analytical tasks.

The next section will guide you through the process of setting up your DataLink Account and connecting to your data environment.

Setting Up Your Avionté DataLink Account

Follow the steps below to set up your connection and start accessing your data.

Step 1: Obtain Your Unique Connectivity Information

After signing up, you will receive an encrypted email from Avionté containing the following details to set up your connection:

  • Server Name/URL: The unique server location where your data is hosted.
  • User: Your account-specific username to access your data environment.
  • Password: A secure password for accessing your data.

Note: If you have not received this email, or require assistance, please contact your Client Manager or our support team at support@avionte.com. For security reasons, you will be prompted to change your password on first login.

Step 2: Connect to Your Data

The Avionté DataLink product is delivered through the third-party Snowflake application.

2.1 Accessing Snowflake

  1. Open a browser and go to the URL provided in your onboarding email.
  2. On the Snowflake login screen, enter your Username (Name) and Password from the onboarding email.
  3. Click Sign In.

2.2 Securing Your Account

After first login, you will be prompted to:

  1. Change your password. Choose a strong, unique password and confirm the change.
  2. Set up Multi-Factor Authentication (MFA) for your account: Click the Continue button to begin the MFA enrollment process, then follow the on-screen instructions to complete the setup.

Explore Your Data

Once you’ve logged in and secured your account, you can begin exploring and analyzing your data.

Viewing Table and Column Information

  1. In the left-hand menu, click Data and select Databases.
  2. Locate your database, which will be named PROD_ANALYTICS_{Customer Specific Code}. The {Customer Specific Code} is unique to your account and will be part of your username. For example, if your username is USER_ABC, your database will be named PROD_ANALYTICS_ABC.
  3. Click on DBO, then navigate to Dynamic Tables.
  4. Click on any table name to inspect its structure and columns.

Running SQL Queries

While Avionté recommends extracting the DataLink data into your own Data Warehouse environment for ongoing use, you can initially familiarize yourself with the data structures by running SQL queries directly from the User Interface. Follow these steps:

  1. In the left-hand menu, click Home.
  2. From the main screen, select Query data.
  3. Use Snowflake’s query editor to write and execute SQL queries, retrieve data, or perform analyses.

Key Tables You Will Use

To help you get started, here’s an overview of some core tables you’ll likely use. For a complete reference, consult the Data Dictionary, which details column names, data types, and relationships across tables.

Fact Tables (Quantitative Data):

  • FCTACTIVITY: Records applicant, job, and contact activities.
  • PLACEMENT_FACT: Provides details on placement-related metrics.
  • GROSSPROFIT_FACT: Contains gross profit figures tied to placements.
  • REVENUE_ACTIVITY_FACT: Tracks revenue and job activity data.

Dimension Tables (Descriptive Data):

  • DIMDATE: Contains dates, useful for time-based analyses.
  • DIMCOMPANY: Holds information about company records.
  • DIMAPPLICANT: Holds applicant details linked to activities.
  • DIMOFFICE: Office-specific details, linking activities to offices.

Example Queries:

  • Count of Records in FCTACTIVITY:
    SELECT COUNT(1) FROM FCTACTIVITY;
  • Filter Placements by Date in PLACEMENT_FACT:
    SELECT * FROM PLACEMENT_FACT WHERE DATEKEY BETWEEN '20230101' AND '20231231';
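  • Join Facts to Dimensions: Beyond single-table queries, you can relate facts to dimensions. The sketch below counts placements by month, assuming PLACEMENT_FACT’s DATEKEY joins to a DATEKEY column on DIMDATE and that DIMDATE exposes a calendar date column (called CALENDARDATE here; confirm both names in the Data Dictionary):
    -- Monthly placement counts; the join key and CALENDARDATE are assumptions.
    SELECT DATE_TRUNC('month', d.CALENDARDATE) AS placement_month,
           COUNT(*) AS placements
    FROM PLACEMENT_FACT p
    JOIN DIMDATE d ON d.DATEKEY = p.DATEKEY
    GROUP BY 1
    ORDER BY 1;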

The Snowflake interface allows you to run SQL queries, explore tables, and view data directly. While Snowflake provides a robust interface for querying data, the true power of your DataLink Account lies in integrating this data with your existing analytics tools.

Integrating with Third-Party BI Tools

To maximize the potential of your DataLink data, we recommend integrating with third-party BI tools such as Tableau, Power BI, or Domo. These tools provide advanced visualization capabilities, enhancing the analysis and presentation of your data. This guide provides setup instructions for these tools, but it is not an exhaustive list — you may use alternate BI tools that support Snowflake connections. Most tools provide a straightforward Connect to Snowflake option. Use your updated credentials (Server Name/URL, Username, and new password) to establish a connection.

Setup and Login Instructions for Specific Tools:

  • Power BI Integration: Step-by-step guide for connecting Power BI to your Snowflake account.
  • Tableau Integration: Detailed instructions on setting up Tableau with Snowflake.
  • Domo Integration: Step-by-step guide for connecting Domo to your Snowflake account.

Each section includes step-by-step instructions to help you integrate these tools with your Avionté DataLink data.

Power BI Integration

Note: Power BI Online does not currently support direct integration with Snowflake. To connect to Snowflake, you must complete the setup using Power BI Desktop. Once your reports are created in Power BI Desktop, you can publish them to Power BI Online for sharing.

Follow these steps to connect Power BI to your Snowflake account:

  1. Launch the Power BI application and log in.
  2. Click on the Home tab.
  3. Select Get data > Database > Snowflake.
  4. In the Snowflake connector window, input your Server (Snowflake Server Name/URL from your onboarding email) and, optionally, Warehouse (formatted as PROD_WH_{Customer Specific Code}, the name of your Snowflake computing warehouse).
  5. Optionally, enter values under Advanced options.
  6. Click OK.
  7. In the authentication prompt, select Basic Authentication.
  8. Enter your Username and your updated Password, then click Connect.
  9. After a successful connection, the Navigator window will appear. Browse through the list of available tables in your Snowflake database.
  10. Locate your database which will be named PROD_ANALYTICS_{Customer Specific Code}.
  11. Click on DBO, then navigate to Dynamic Tables.
  12. Select the desired table (e.g. FCTACTIVITY) to inspect its structure and load the data.
  13. Select the datasets you need and click Load to import the data into Power BI Desktop.

Once connected, you can start exploring your data and building visualizations in Power BI Desktop.

Tableau Integration

Follow these steps to connect Tableau to your Snowflake account:

  1. Open Tableau Desktop.
  2. In the Connect left-hand menu, under the To a Server section, select Snowflake.
  3. If you don’t have the Snowflake driver installed, Tableau may prompt you to download it. Follow the on-screen instructions to complete the installation before proceeding.
  4. In the connection dialog box on the General tab, enter:
    Server: The name of the Server (Snowflake Server Name/URL from your onboarding email)
    Role (optional): Can be left blank.
    Warehouse: The name of your Snowflake computing warehouse, formatted as PROD_WH_{Customer Specific Code}.
    Authentication: Select Username and Password as the authentication method.
  5. Complete the login steps for the selected authentication method by entering:
    Username: Your username from the onboarding email.
    Password: Your updated Snowflake password.
  6. Click Sign In.
  7. Once you’ve entered your Snowflake login credentials, Tableau will connect to Snowflake.
  8. After connecting, Tableau will display the available databases and schemas. Select your database (e.g., PROD_ANALYTICS_{Customer Specific Code}) and schema (e.g., DBO).
  9. Drag and drop tables into the Tableau workspace to create your data source.

Once connected, you can start exploring your Snowflake data and building visualizations in Tableau Desktop.

Domo Integration

Follow these steps to connect Domo to your Snowflake account:

  1. Log in to your Domo instance.
  2. Click on the Data tab in the top navigation (or click More > Data).
  3. Select Connect Data > Connectors, then search for Snowflake in the connector list.
  4. Click on the Snowflake Connector.
  5. In the Account Settings section, enter the following details from your onboarding email:
    1. Host: Your Snowflake Server Name/URL.
    2. Warehouse: Specify the warehouse for query execution (e.g. PROD_WH_{Customer Specific Code}).
    3. Database: Select the database (e.g., PROD_ANALYTICS_{Customer Specific Code}).
    4. Schema: Use DBO or another relevant schema.
  6. Choose Username & Password as the Authentication Method.
  7. Enter your Username and your updated Password as the credentials.
  8. Click Test Connection to verify the setup.
  9. After a successful connection, Domo will display the available databases and schemas from your Snowflake instance.
  10. Locate your database (e.g., PROD_ANALYTICS_{Customer Specific Code}).
  11. Select the schema (e.g., DBO) and browse through the available tables.
  12. Choose the dataset(s) you need and configure the import settings.
  13. Click Save & Run to load the data into Domo for visualization and analysis.
