Integrations Overview

Erdo integrations connect your AI agents to data sources such as databases, APIs, and file systems, so your agents can access, analyze, and work with your existing data.

How Integrations Work

Integrations provide a standardized way for agents to:
  • Authenticate with external services
  • Discover available data resources
  • Access data through a unified interface
  • Transform data for AI analysis
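
Conceptually, every integration exposes these same few capabilities through one small surface. The sketch below is illustrative, not the actual Erdo interface; the method names mirror the patterns shown later on this page.
from typing import Any, Protocol

class IntegrationProtocol(Protocol):
    """Illustrative shape of an integration; not the actual Erdo interface."""

    def authenticate(self, config: dict[str, Any]) -> None:
        """Establish a session with the external service."""
        ...

    def discover_resources(self) -> list[str]:
        """List available tables, collections, objects, or files."""
        ...

    def query(self, request: str) -> Any:
        """Fetch data through the unified interface."""
        ...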

Built-in Integrations

Databases

PostgreSQL
  • Full SQL query support
  • Schema discovery
  • Relationship mapping
MySQL
  • Complete database access
  • Table and view discovery
  • Performance optimization
MongoDB
  • Document collection access
  • Aggregation pipeline support
  • Index optimization
SQLite
  • Local database support
  • File-based storage
  • Lightweight queries

APIs & Services

Salesforce
  • CRM data access
  • Object and field discovery
  • Real-time sync capabilities
HubSpot
  • Marketing and sales data
  • Contact and deal management
  • Pipeline analytics
Stripe
  • Payment and subscription data
  • Transaction history
  • Customer analytics
Google Analytics
  • Website traffic data
  • User behavior analysis
  • Conversion tracking

File Systems

Local Files
  • CSV, JSON, Excel support
  • Directory scanning
  • File metadata extraction
AWS S3
  • Cloud storage access
  • Bucket and object discovery
  • Secure file handling
Google Drive
  • Document access
  • Folder structure mapping
  • Collaborative file handling

Using Integrations

Setting Up an Integration

  1. Choose Integration Type: Select from the available integrations in your dashboard
  2. Configure Authentication: Provide credentials (API keys, OAuth tokens, or database connection details)
  3. Test Connection: Verify connectivity and permissions
  4. Discover Resources: Let Erdo scan and catalog available data

In Agent Workflows

Use integrations in your agents:
from erdo import Agent, state
from erdo.actions import llm

# Agent with database integration
data_analyzer = Agent(
    name="sales_analyzer",
    description="Analyzes sales data from CRM"
)

# Query data from integrated source
query_step = data_analyzer.step(
    llm.message(
        model="claude-sonnet-4",
        query=f"Analyze sales trends from: {state.database_query_result}",
        system_prompt="You are a sales analyst. Provide insights on sales performance."
    )
)
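
This step assumes an earlier step has already placed query results in state.database_query_result. How that state gets populated depends on your workflow; one hypothetical wiring, reusing the integration.query pattern shown under Data Access Patterns below:
# Hypothetical wiring: a prior step fetches CRM rows and stores them in state
# so the analysis step can reference state.database_query_result.
rows = integration.query(
    "SELECT region, amount, closed_at FROM sales WHERE closed_at > '2024-01-01'"
)
state.database_query_result = rows  # exact state wiring depends on your setup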

Data Access Patterns

Common patterns for accessing integrated data:
Direct Queries
# SQL databases
query_result = integration.query("SELECT * FROM sales WHERE date > '2024-01-01'")

# NoSQL databases
documents = integration.find({"status": "active"})

# APIs
data = integration.get("/api/customers")
Resource Discovery
# List available tables/collections
resources = integration.discover_resources()

# Get schema information
schema = integration.get_schema("customers")
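
These patterns compose naturally: discover what exists, inspect its schema, then issue a bounded query. A short sketch using the same calls as above:
# Enumerate resources and inspect their schemas before querying
for name in integration.discover_resources():
    schema = integration.get_schema(name)
    print(name, schema)

# With the schema in hand, issue a bounded query
recent_customers = integration.query("SELECT id, email FROM customers LIMIT 100")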

Authentication Methods

OAuth 2.0

For modern APIs requiring secure token-based authentication:
oauth_config = {
    "client_id": "your_client_id",
    "client_secret": "your_client_secret",
    "scopes": ["read:data", "read:metadata"]
}
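
For reference, the token exchange behind this configuration looks roughly like the client-credentials flow below. Erdo handles this for you; the token endpoint and the use of requests are purely illustrative:
import requests

# Illustrative OAuth 2.0 client-credentials exchange (hypothetical endpoint)
response = requests.post(
    "https://auth.example.com/oauth/token",
    data={
        "grant_type": "client_credentials",
        "client_id": oauth_config["client_id"],
        "client_secret": oauth_config["client_secret"],
        "scope": " ".join(oauth_config["scopes"]),
    },
    timeout=10,
)
response.raise_for_status()
access_token = response.json()["access_token"]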

API Keys

For services using API key authentication:
api_config = {
    "api_key": "your_api_key",
    "api_secret": "your_api_secret"  # if required
}

Database Connections

For direct database access:
db_config = {
    "host": "localhost",
    "port": 5432,
    "database": "mydb",
    "username": "user",
    "password": "password"
}
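
Avoid hardcoding secrets like the password above. Reading them from the environment keeps them out of source control; a minimal sketch:
import os

# Pull connection details from the environment instead of embedding them in code
db_config = {
    "host": os.environ.get("DB_HOST", "localhost"),
    "port": int(os.environ.get("DB_PORT", "5432")),
    "database": os.environ["DB_NAME"],
    "username": os.environ["DB_USER"],
    "password": os.environ["DB_PASSWORD"],
}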

Data Security

Credential Management

  • Credentials are encrypted at rest
  • Secure transmission using TLS
  • Role-based access controls
  • Audit logging for all access

Data Privacy

  • Data remains in your environment
  • No unnecessary data copying
  • Configurable data retention
  • GDPR/CCPA compliance support

Network Security

  • VPC/private network support
  • IP whitelisting capabilities
  • Secure tunneling options
  • Certificate-based authentication

Best Practices

Performance Optimization

Query Optimization
  • Use indexed columns in WHERE clauses
  • Limit result sets appropriately
  • Cache frequently accessed data
  • Use connection pooling
Resource Management
  • Close connections properly
  • Monitor connection limits
  • Implement retry logic (see the sketch after this list)
  • Handle timeouts gracefully
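
Retry logic and timeout handling often combine into one helper. A generic sketch with exponential backoff, assuming the query call raises ConnectionError or TimeoutError on transient failures:
import time

def query_with_retry(integration, sql, attempts=3, base_delay=1.0):
    """Retry transient failures with exponential backoff (illustrative)."""
    for attempt in range(attempts):
        try:
            return integration.query(sql)
        except (ConnectionError, TimeoutError):
            if attempt == attempts - 1:
                raise  # out of retries; let the caller handle it
            time.sleep(base_delay * 2 ** attempt)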

Error Handling

Connection Issues
import logging

logger = logging.getLogger(__name__)

try:
    data = integration.query(sql_query)
except ConnectionError as e:
    # Handle connection failures
    logger.error(f"Database connection failed: {e}")
    # Implement fallback or retry logic
Authentication Failures
try:
    api_data = integration.get("/api/data")
except AuthenticationError as e:  # raised by your integration client, not a builtin
    # Handle auth failures
    logger.error(f"Authentication failed: {e}")
    # Refresh tokens or re-authenticate
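
When tokens expire mid-session, a common pattern is to refresh once and retry. refresh_credentials below is hypothetical, a stand-in for whatever re-authentication hook your integration exposes:
def get_with_reauth(integration, path):
    """Retry once after refreshing credentials (illustrative pattern)."""
    try:
        return integration.get(path)
    except AuthenticationError:
        integration.refresh_credentials()  # hypothetical re-auth hook
        return integration.get(path)       # a second failure propagates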

Monitoring & Maintenance

Health Checks
  • Regular connection testing
  • Performance monitoring
  • Error rate tracking
  • Usage analytics
Maintenance Tasks
  • Credential rotation
  • Schema change detection
  • Performance optimization
  • Security updates
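
The regular connection testing bullet above can be as simple as a cheap probe query:
def is_healthy(integration):
    """Liveness probe; SELECT 1 is a cheap no-op for most SQL sources."""
    try:
        integration.query("SELECT 1")
        return True
    except ConnectionError:
        return False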

Creating Custom Integrations

For data sources not covered by built-in integrations, you can create custom integrations:
  1. Implement Provider Interface: Create a provider that implements authentication and data access (a skeleton follows this list)
  2. Define Resource Discovery: Specify how to discover and catalog available data
  3. Test Integration: Test comprehensively against real data sources
  4. Deploy and Register: Make the integration available for use in agents
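
A hypothetical skeleton of steps 1 and 2; the actual interface and registration mechanics are covered in the guide linked below:
class MyCustomProvider:
    """Skeleton custom integration (illustrative; see the guide for the real interface)."""

    def __init__(self, config: dict):
        self.config = config

    def authenticate(self) -> None:
        # Step 1: establish a session using the credentials in self.config
        raise NotImplementedError

    def discover_resources(self) -> list[str]:
        # Step 2: return the tables/collections/objects this provider exposes
        raise NotImplementedError

    def query(self, request: str):
        # Unified data access used by agents
        raise NotImplementedError
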
See the Creating Custom Integrations guide for detailed instructions.

Troubleshooting

Common Issues

Connection Timeouts
  • Increase timeout values
  • Check network connectivity
  • Verify firewall settings
Authentication Errors
  • Verify credentials are correct
  • Check token expiration
  • Confirm required permissions
Data Access Issues
  • Validate query syntax
  • Check data permissions
  • Verify resource exists
Performance Problems
  • Optimize queries
  • Add appropriate indexes
  • Implement caching
  • Use connection pooling

Getting Help

  • Check integration logs
  • Review error messages
  • Consult API documentation
  • Contact support team
Integrations are the foundation for connecting your AI agents to real data sources, enabling powerful automated analysis and decision-making workflows.