Integrate Pickle Files with Edilitics
Learn how to integrate Pickle files with Edilitics for seamless data workflows. Follow this guide to securely configure and manage Pickle integrations.
Pickle is Python's standard-library module for serializing and deserializing Python objects. It converts complex data structures, such as dictionaries and lists, into a byte stream for storage or transmission and recreates the original objects when needed. It is particularly useful for persisting Python objects to disk and for sharing data between Python applications.
Within Edilitics, Pickle files are used exclusively as data sources, enabling seamless data ingestion for advanced analytics. This guide provides a comprehensive, step-by-step approach to integrating Pickle files into Edilitics while ensuring data integrity and performance efficiency.
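For context, serializing and deserializing with Python's built-in `pickle` module looks like the sketch below (the file name is illustrative):

```python
import pickle

# Any Python object can be serialized, including nested structures.
record = {"name": "John", "age": 30, "email": "john@example.com"}

# Serialize ("pickle") the object to a byte stream on disk.
with open("record.pkl", "wb") as f:
    pickle.dump(record, f)

# Deserialize ("unpickle") to recreate the original object.
with open("record.pkl", "rb") as f:
    restored = pickle.load(f)
```

A file produced this way is exactly what the Pickle connector expects as an upload.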
Before You Begin
Ensure the following prerequisites are met:
- File Size Limit: Pickle files must not exceed 100 MB.
- Recommended Format: While various data structures are supported, structuring data in a tabular format (like DataFrames) allows for more efficient analysis within Edilitics.
- Usage Constraints:
  - Pickle files are supported only as data sources, not as destinations.
  - Workflows using Pickle files:
    - support full loads via "Schedule as Once" only;
    - support "Schedule as Once" in Transform;
    - do not support auto updates or data refreshes in Visualize.
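Before uploading, you can sanity-check a file locally. This minimal sketch (the helper name is ours; the 100 MB limit comes from the prerequisites above) verifies the size constraint and confirms the file unpickles cleanly:

```python
import os
import pickle

MAX_BYTES = 100 * 1024 * 1024  # Edilitics' 100 MB upload limit


def check_pickle(path: str):
    """Return the unpickled object if the file is within limits and loadable."""
    size = os.path.getsize(path)
    if size > MAX_BYTES:
        raise ValueError(f"{path} is {size} bytes; the limit is {MAX_BYTES}")
    with open(path, "rb") as f:
        # pickle.load raises if the byte stream is corrupt or truncated.
        return pickle.load(f)
```

Only unpickle files you created yourself or trust; Edilitics performs its own security scans on upload.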
File Security and Management
Edilitics ensures secure and efficient handling of Pickle files:
- Security Scans: All uploaded files are scanned for potential security risks and validated for data integrity.
- Encryption: Files are encrypted during storage and decrypted only during user access or workflow execution (Transform, Visualize).
- Permanent Deletion: Upon deleting an integration, the associated file is permanently removed from Edilitics systems to ensure data privacy compliance.
Supported Data Structures
Edilitics supports the following data structures within Pickle files:
| Data Type | Description | Example |
|---|---|---|
| Lists | An ordered collection of elements of any data type. | [1, 2, 3, 4, 5] |
| Dictionaries | A collection of key-value pairs, with unique keys. | {"name": "John", "age": 30, "email": "john@example.com"} |
| Tuples | An ordered, immutable collection of elements of any data type. | (10.0, 20.0) |
| Sets | An unordered collection of unique elements. | {1, 2, 3, 4, 5} |
| DataFrames | Two-dimensional tabular data with labeled rows and columns. | Sales data with columns for Date, Product, Quantity, and Price. |
Note: Structuring Pickle data as DataFrames simplifies analysis and improves performance within Edilitics.
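As a sketch of the recommended format (pandas is assumed to be available; the file name and column values are illustrative), tabular data matching the DataFrame example above can be pickled directly:

```python
import pandas as pd

# Sales data in the recommended tabular form: labeled columns, one row per record.
sales = pd.DataFrame({
    "Date": ["2024-01-01", "2024-01-02"],
    "Product": ["Widget", "Gadget"],
    "Quantity": [3, 5],
    "Price": [9.99, 24.50],
})

# DataFrame.to_pickle writes the pickle file in one call.
sales.to_pickle("sales.pkl")

# pd.read_pickle recreates the DataFrame exactly.
restored = pd.read_pickle("sales.pkl")
```

A file created this way arrives in Edilitics with its column structure intact, which simplifies downstream analysis.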
Steps to Integrate Pickle Files
Step 1: Add the Pickle Connector
1. Navigate to the Integrations module in Edilitics.
2. Click New Integration.
3. Search for and select the Pickle connector.
Step 2: Configure the Integration
Fill in the following details to configure the Pickle integration:
| Field Name | Details |
|---|---|
| Integration Title | A unique identifier for your integration. |
| Integration Description | A concise summary of the Pickle data being integrated. |
| AI Driven Column Insights | Enable this feature to have AI generate metadata-driven column descriptions automatically. If disabled, column descriptions can be curated manually to match specific business requirements. User-modified descriptions are treated as authoritative and are never overwritten. |
| File Upload | Upload the Pickle file directly from your local storage (must be ≤ 100 MB). |
Step 3: Validate and Save
- Click Test & Save Connection to validate the uploaded file.
- Edilitics scans the file for schema compliance and security validation.
- Upon successful validation, the file is securely encrypted and saved for use in workflows.
Need help? Email support@edilitics.com with your workspace, job ID, and context. We reply within one business day.