Automate file upload

Hello! I am receiving multiple CSV files via email, and I need to automate the process of uploading their data to Domo. Unfortunately, I can't use the email connector because the sender puts many files in one email. However, I can upload all the files into a folder. Do you have a solution for this? Thanks!
Best Answer
I was only introduced to using Python inside Jupyter Notebooks in the past few months, but it has worked well. There are demo pages and videos out there. Short and sweet: go to "Data" and, on the left side, click the three dots; you will find Jupyter Notebooks there. Within your Python code you can edit, check responses, debug portions at a time, and save and schedule just like you do with Magic ETLs. It didn't take me long, after watching a video I found through Google, to muster up the courage and create a "sample" workbook. I created a Python page, hit the run icon, and it did what I told it. Then I got a little more advanced.
My first attempt at your question would be something like:

    pip install pandas pydomo

This installs pandas and the Domo Python SDK (pydomo); os and glob ship with Python's standard library. Then something like:

    import os
    import glob
    import pandas as pd
    from pydomo import Domo

    # Configuration for the Domo API
    DOMO_CLIENT_ID = 'your_client_id'
    DOMO_CLIENT_SECRET = 'your_client_secret'
    DOMO_API_HOST = 'api.domo.com'

    # Folder where the CSV files are located
    CSV_FOLDER_PATH = '/path/to/your/csv_folder'

    # Domo dataset details
    DATASET_ID = 'your_dataset_id'  # Replace with your actual dataset ID

    def read_csv_files(folder_path):
        # List all CSV files in the specified folder
        csv_files = glob.glob(os.path.join(folder_path, '*.csv'))
        # Read each CSV file into a DataFrame
        dataframes = [pd.read_csv(file) for file in csv_files]
        # Concatenate all DataFrames into a single DataFrame
        return pd.concat(dataframes, ignore_index=True)

    def upload_to_domo(df, dataset_id):
        # Initialize the Domo client
        domo = Domo(DOMO_CLIENT_ID, DOMO_CLIENT_SECRET, api_host=DOMO_API_HOST)
        # Convert the DataFrame to CSV text for upload
        csv_data = df.to_csv(index=False)
        # Replace the dataset's contents with the CSV data
        domo.datasets.data_import(dataset_id, csv_data)
        print(f'Data uploaded to Domo dataset {dataset_id} successfully.')

    # Main execution
    if __name__ == '__main__':
        # Read and combine the CSV files
        combined_df = read_csv_files(CSV_FOLDER_PATH)
        # Upload the combined DataFrame to Domo
        upload_to_domo(combined_df, DATASET_ID)
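
One optional tweak, using only pandas (nothing on the Domo side changes): tag each row with the file it came from, so you can trace a bad value back to a specific CSV after everything is combined. The source_file column name is just an example, and the dataset's schema in Domo would need to include it.

    import os
    import glob
    import pandas as pd

    def read_csv_files(folder_path):
        # Same combine logic as above, plus a column recording each row's source file
        csv_files = glob.glob(os.path.join(folder_path, '*.csv'))
        dataframes = []
        for file in csv_files:
            df = pd.read_csv(file)
            # Example column name - remember to add it to the Domo dataset schema too
            df['source_file'] = os.path.basename(file)
            dataframes.append(df)
        return pd.concat(dataframes, ignore_index=True)
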
** Was this post helpful? Click Agree or Like below. **
** Did this solve your problem? Accept it as a solution! **
Answers
Even though your e-mail has multiple files, you can set up multiple forwarding rules and use the DataSet via Email connector to handle each file. You can use the attachment name expression field to differentiate which file you want to connect to (there's a quick illustration of the idea further below). Check out the documentation in this KB article.
https://domo-support.domo.com/s/article/360042931954?language=en_US

**Check out my Domo Tips & Tricks Videos**
**Make sure to like any users' posts that helped you.**
**Please mark as accepted the ones who solved your issue.**
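
To picture what the attachment name expression idea above is doing, here is a tiny standalone illustration. The file names and patterns are made up, and the connector's exact matching rules are covered in the KB article, so treat these strictly as placeholders.

    import re

    # Hypothetical attachment names arriving in a single email
    attachments = ['sales_2024-05-01.csv', 'inventory_2024-05-01.csv', 'returns_2024-05-01.csv']

    # One pattern per forwarding rule / DataSet via Email connector
    patterns = {
        'Sales dataset': r'^sales_.*\.csv$',
        'Inventory dataset': r'^inventory_.*\.csv$',
        'Returns dataset': r'^returns_.*\.csv$',
    }

    # Each dataset picks up only the attachment whose name matches its pattern
    for dataset, pattern in patterns.items():
        matched = [name for name in attachments if re.search(pattern, name)]
        print(dataset, '->', matched)
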
Thank you @Jones01 and @MarkSnodgrass! I will try both methods. 😀
I like @MarkSnodgrass's suggestion. You could also use a Python script with the imaplib, email, and os libraries to automate downloading the attachments from your emails (a rough sketch follows below). Alternatively, third-party tools could possibly automate the download process.
Domo Workbench could probably do it as well, but you would need to handle the specifics for files, subfolders, and transformations.
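
Here is a rough sketch of that imaplib/email approach. The IMAP host, credentials, mailbox, and save folder are placeholders, and you would want to adjust the search criteria and error handling for your mail provider.

    import email
    import imaplib
    import os

    # Placeholder connection details - adjust for your mail provider
    IMAP_HOST = 'imap.example.com'
    USERNAME = 'you@example.com'
    PASSWORD = 'your_password'
    SAVE_FOLDER = '/path/to/your/csv_folder'

    def download_csv_attachments():
        # Connect to the mailbox and select the inbox
        mail = imaplib.IMAP4_SSL(IMAP_HOST)
        mail.login(USERNAME, PASSWORD)
        mail.select('INBOX')

        # Find unread messages (tighten the search criteria as needed)
        _, data = mail.search(None, 'UNSEEN')
        for num in data[0].split():
            _, msg_data = mail.fetch(num, '(RFC822)')
            msg = email.message_from_bytes(msg_data[0][1])

            # Walk the MIME parts and save any CSV attachments to the folder
            for part in msg.walk():
                filename = part.get_filename()
                if filename and filename.lower().endswith('.csv'):
                    path = os.path.join(SAVE_FOLDER, filename)
                    with open(path, 'wb') as f:
                        f.write(part.get_payload(decode=True))

        mail.logout()

    if __name__ == '__main__':
        download_csv_attachments()
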
** Was this post helpful? Click Agree or Like below. **
** Did this solve your problem? Accept it as a solution! **
Thanks @ArborRose! I will look into using Python for this automation. I will probably ask some follow-up questions on that. 😁
That's great! Thank you @ArborRose! 😀