Key Responsibilities
MuleSoft Integrations
- Design and configure Scheduler flows for time-triggered data synchronization
- Build timezone-aware, dynamic Scheduler jobs using properties-based configuration
- Implement chained flows calling multiple downstream systems
- Monitor and audit scheduled runs for debugging and traceability
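The timezone-aware, properties-driven scheduling described above is configured declaratively in Mule (Scheduler component plus a properties file), but the core idea can be sketched in Python. The `CONFIG` keys below are hypothetical placeholders standing in for `${...}` property references:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

# Hypothetical properties-style config; in a Mule app these values would
# live in a properties file and be injected via ${...} placeholders.
CONFIG = {"sync.timezone": "America/New_York", "sync.hour": 2, "sync.minute": 30}

def next_run(now_utc: datetime) -> datetime:
    """Return the next daily trigger instant, in UTC, for the configured local time."""
    tz = ZoneInfo(CONFIG["sync.timezone"])
    local_now = now_utc.astimezone(tz)
    candidate = local_now.replace(hour=CONFIG["sync.hour"],
                                  minute=CONFIG["sync.minute"],
                                  second=0, microsecond=0)
    if candidate <= local_now:
        candidate += timedelta(days=1)  # today's slot already passed
    return candidate.astimezone(ZoneInfo("UTC"))
```

Keeping the timezone in configuration rather than code is what lets the same job run correctly across regions and DST changes.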
Webhooks & Event-Driven Integrations
- Configure public-facing HTTP Listener endpoints with robust security and validations
- Handle webhook payloads asynchronously with deduplication and queuing logic
- Parse and validate headers, content types, and payload signatures
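Payload-signature validation, mentioned above, typically means recomputing an HMAC over the raw request body and comparing it in constant time. A minimal sketch, assuming the provider sends a hex digest in a header shaped like `sha256=<hexdigest>` (the header name and format vary by provider):

```python
import hashlib
import hmac

def verify_signature(secret: bytes, payload: bytes, signature_header: str) -> bool:
    """Constant-time verification of an HMAC-SHA256 webhook signature."""
    expected = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    received = signature_header.removeprefix("sha256=")
    # compare_digest avoids timing side channels on the comparison
    return hmac.compare_digest(expected, received)
```

Using `hmac.compare_digest` rather than `==` matters on a public-facing endpoint, where response-time differences can leak signature bytes.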
Data Transformation
- Use DataWeave to transform data among JSON, XML, CSV, and Java objects
- Handle complex nested mappings and reusable transformation functions
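In Mule these mappings are written in DataWeave; the Python sketch below mirrors what a DataWeave `map` over `payload.orders` would do, including a nested field lookup and a computed column. The field names (`orders`, `customer.name`, `items`) are illustrative, not from any real schema:

```python
import csv
import io
import json

def orders_to_csv(json_text: str) -> str:
    """Flatten a nested JSON order list into CSV rows."""
    orders = json.loads(json_text)["orders"]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "customer", "total"])
    writer.writeheader()
    for o in orders:
        writer.writerow({
            "id": o["id"],
            "customer": o["customer"]["name"],  # nested mapping
            "total": sum(i["price"] * i["qty"] for i in o["items"]),  # derived field
        })
    return buf.getvalue()
```

Factoring mappings like this into named, reusable functions (DataWeave modules in Mule) keeps complex nested transformations testable.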
Batch Processing
- Develop scalable batch jobs for file and data processing
- Integrate batch processing flows with cloud storage systems
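Mule batch jobs process records in fixed-size blocks so large files never load fully into memory. A minimal sketch of that chunking idea, with the block size as a parameter (Mule's default block size differs; this is illustrative only):

```python
from itertools import islice
from typing import Iterable, Iterator, List, TypeVar

T = TypeVar("T")

def blocks(records: Iterable[T], size: int) -> Iterator[List[T]]:
    """Yield fixed-size blocks of records, like a batch job's block size.
    The final block may be smaller than `size`."""
    it = iter(records)
    while block := list(islice(it, size)):
        yield block
```

Streaming block-by-block is what lets the same flow scale from a hundred-row CSV to a multi-gigabyte export pulled from cloud storage.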
Connector Integrations
- Use connectors for AWS SQS, MongoDB, PostgreSQL, Airtable, and Slack
- Build resilient integrations with logging, retries, and monitoring using Anypoint Monitoring or Datadog
- Write custom loggers and error-handling strategies
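The retries-plus-logging pattern above maps to Mule's Until Successful scope with a custom logger; a language-agnostic sketch of the same exponential-backoff idea (the function and parameter names are made up for illustration):

```python
import time

def with_retries(call, attempts=3, base_delay=0.01, logger=print):
    """Invoke `call`, retrying with exponential backoff and logging each failure.
    Re-raises the last exception once attempts are exhausted."""
    for attempt in range(1, attempts + 1):
        try:
            return call()
        except Exception as exc:
            logger(f"attempt {attempt} failed: {exc}")
            if attempt == attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))  # 1x, 2x, 4x, ...
```

Logging every failed attempt (not just the final one) is what makes the retry history visible in Anypoint Monitoring or Datadog.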
Technical Environment & Tools
- Anypoint Platform – Studio, API Manager, Runtime Manager, Exchange
- DataWeave, MUnit, Postman, CloudHub, AWS
- MongoDB: Compass, Shell, Robo 3T
- PostgreSQL: pgAdmin, DBeaver, psql
- Airtable: REST API, JavaScript, OAuth
Required Skills & Experience
- 3+ years of hands-on MuleSoft development experience
- Strong experience with Scheduler jobs, batch processing, webhooks, and data transformations
- Proficiency in designing and integrating with MongoDB, PostgreSQL, and REST APIs
- Strong SQL skills and JSON/XML transformation skills
- Ability to build and scale integrations using MuleSoft connectors
Nice to Have
- Experience with MongoDB aggregation pipelines and ETL workflows
- Integration experience with Airtable (views, automations, REST API, scripting)
- Familiarity with OAuth, rate limiting, and building real-time sync with databases
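Rate limiting, mentioned in the last bullet, is often handled client-side with a token bucket so sync flows stay under an API's documented quota. A minimal sketch; the rate and capacity values are illustrative, not any particular API's real limits:

```python
import time

class TokenBucket:
    """Client-side token-bucket rate limiter."""

    def __init__(self, rate: float, capacity: float, clock=time.monotonic):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.clock = clock        # injectable for testing
        self.last = clock()

    def allow(self) -> bool:
        """Consume one token if available; otherwise deny the call."""
        now = self.clock()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

Denied calls can be queued and retried, which pairs naturally with the deduplication and queuing logic used for webhook-driven real-time sync.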