Automate file upload

Hello! I am receiving multiple CSV files via email, and I need to automate the process of uploading this data to Domo. Unfortunately, I can't use the email connector because they send many files in one email. However, I can upload all the files into a folder. Do you have a solution for this? Thanks!

Best Answer

  • ArborRose
    ArborRose Coach
    Answer ✓

    I was only introduced to using Python inside Jupyter Notebooks in the past few months, but it has worked well. Somewhere out there are demo pages and videos. Short and sweet: go to "Data" and on the left side there are three dots; click there and you will find Jupyter Notebooks. Within your Python code you can edit, check responses, debug portions at a time, and save and schedule like you do with Magic ETLs. It didn't take me long, after watching some video I found through Google, to muster up the courage and create a "sample" workbook. I created a Python page, hit the run icon, and it did what I told it. Then I got a little more advanced.

    My first attempt at your question would be something like:

    pip install pandas pydomo
    

    This would install pandas and the pydomo library (the Domo Python SDK); os and glob come with Python's standard library, so they don't need a separate install. Then something like:

    import glob
    import os

    import pandas as pd
    from pydomo import Domo


    # Configuration for the Domo API
    DOMO_CLIENT_ID = 'your_client_id'
    DOMO_CLIENT_SECRET = 'your_client_secret'
    DOMO_API_HOST = 'api.domo.com'

    # Folder where the CSV files are located
    CSV_FOLDER_PATH = '/path/to/your/csv_folder'

    # Domo dataset details
    DATASET_ID = 'your_dataset_id'  # Replace with your actual dataset ID


    def read_csv_files(folder_path):
        # List all CSV files in the specified folder
        csv_files = glob.glob(os.path.join(folder_path, '*.csv'))

        # Read each CSV file into a DataFrame and collect them in a list
        dataframes = []
        for file in csv_files:
            df = pd.read_csv(file)
            dataframes.append(df)

        # Concatenate all DataFrames into a single DataFrame
        combined_df = pd.concat(dataframes, ignore_index=True)
        return combined_df


    def upload_to_domo(df, dataset_id):
        # Initialize the Domo client
        domo = Domo(DOMO_CLIENT_ID, DOMO_CLIENT_SECRET, api_host=DOMO_API_HOST)

        # Convert the DataFrame to CSV text for the upload
        csv_data = df.to_csv(index=False)

        # Replace the dataset's contents with the CSV data
        domo.datasets.data_import(dataset_id, csv_data)
        print(f'Data uploaded to Domo dataset {dataset_id} successfully.')


    # Main execution
    if __name__ == '__main__':
        # Read and combine the CSV files
        combined_df = read_csv_files(CSV_FOLDER_PATH)

        # Upload the combined DataFrame to Domo
        upload_to_domo(combined_df, DATASET_ID)
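
    One caveat with the concat step: if the emailed files don't all share exactly the same header, pd.concat will quietly take the union of the columns and fill the gaps with NaN, which can break the dataset schema in Domo. A small pre-check along these lines, reusing the imports above, can flag a mismatched file before anything is uploaded (check_consistent_columns is just an illustrative helper, not part of any SDK):

    def check_consistent_columns(folder_path):
        """Warn if the CSV files in folder_path do not all share the same columns."""
        csv_files = sorted(glob.glob(os.path.join(folder_path, '*.csv')))
        if not csv_files:
            raise FileNotFoundError(f'No CSV files found in {folder_path}')

        # Read only the header row of each file and compare against the first one
        reference = list(pd.read_csv(csv_files[0], nrows=0).columns)
        for file in csv_files[1:]:
            columns = list(pd.read_csv(file, nrows=0).columns)
            if columns != reference:
                print(f'Warning: {os.path.basename(file)} has columns {columns}, '
                      f'expected {reference}')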

    ** Was this post helpful? Click Agree or Like below. **
    ** Did this solve your problem? Accept it as a solution! **

Answers

  • Jones01
    Jones01 Contributor

    @vaco take a look at the Amazon S3 Advanced connector. I think this will pick up multiple files; a rough sketch of pushing your folder of CSVs into S3 follows the link below.

    https://domo-support.domo.com/s/article/360043436353?language=en_US
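
    If you go that route, the emailed files first have to land in a bucket the connector is pointed at. As a rough illustration (the bucket name, key prefix, and local folder are placeholders, and boto3 has to be installed and credentialed separately), pushing a local folder of CSVs up to S3 could look like this:

    import glob
    import os

    import boto3

    BUCKET = 'your-bucket-name'       # placeholder: bucket the S3 Advanced connector reads
    PREFIX = 'email-csv-drops/'       # placeholder: key prefix configured in the connector
    LOCAL_FOLDER = '/path/to/your/csv_folder'

    # Credentials are picked up from your AWS environment/profile
    s3 = boto3.client('s3')

    # Upload every CSV in the local folder to the bucket
    for path in glob.glob(os.path.join(LOCAL_FOLDER, '*.csv')):
        key = PREFIX + os.path.basename(path)
        s3.upload_file(path, BUCKET, key)
        print(f'Uploaded {path} to s3://{BUCKET}/{key}')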

  • MarkSnodgrass

    Even though your email has multiple files, you can set up multiple forwarding rules and use the DataSet via Email connector to handle each file. You can use the attachment name expression field to differentiate which file you want to connect to. Check out the documentation in this KB article.


    https://domo-support.domo.com/s/article/360042931954?language=en_US

    **Check out my Domo Tips & Tricks Videos**

    **Make sure to <3 any user's posts that helped you.
    **Please mark as accepted the ones that solved your issue.
  • vaco
    vaco Member

    Thank you @Jones01 and @MarkSnodgrass! I will try both methods. 😀

  • ArborRose
    ArborRose Coach

    I like @MarkSnodgrass's suggestion. You could also use a Python script with the imaplib, email, and os libraries to automate downloading the attachments from the emails; a rough sketch is below. Alternatively, third-party tools could possibly automate the download process.

    Domo Workbench could probably do it, but you would need to handle the specifics for files, subfolders, and transformations.
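
    As an illustration of that IMAP route, a bare-bones sketch might look like the following. The mail server, credentials, search criteria, and save folder are all placeholders you would need to adapt, and credentials should come from a secrets store rather than be hard-coded:

    import email
    import imaplib
    import os

    IMAP_HOST = 'imap.example.com'       # placeholder: your mail server
    IMAP_USER = 'reports@example.com'    # placeholder: the mailbox receiving the CSVs
    IMAP_PASSWORD = 'your_password'      # placeholder: load from a secrets store in practice
    SAVE_FOLDER = '/path/to/your/csv_folder'

    # Connect and open the inbox
    mail = imaplib.IMAP4_SSL(IMAP_HOST)
    mail.login(IMAP_USER, IMAP_PASSWORD)
    mail.select('INBOX')

    # Find unread messages; tighten the search criteria to match the sender or subject
    status, message_ids = mail.search(None, 'UNSEEN')

    for msg_id in message_ids[0].split():
        status, data = mail.fetch(msg_id, '(RFC822)')
        message = email.message_from_bytes(data[0][1])

        # Save every CSV attachment into the folder the upload script reads from
        for part in message.walk():
            filename = part.get_filename()
            if filename and filename.lower().endswith('.csv'):
                with open(os.path.join(SAVE_FOLDER, filename), 'wb') as f:
                    f.write(part.get_payload(decode=True))

    mail.logout()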

    ** Was this post helpful? Click Agree or Like below. **
    ** Did this solve your problem? Accept it as a solution! **

  • vaco
    vaco Member
    edited September 3

    Thanks @ArborRose! I will look into using Python for this automation. I will probably ask some follow-up questions on that. 😁


  • vaco
    vaco Member

    That's great! Thank you @ArborRose! 😀