Description
Overview of the Role
Global supply chains still rely on slow, manual processes: email, spreadsheets, and fragmented data. The economy moves fast, but supply chains don't, an inefficiency that affects the $13T in goods shipped annually and represents one of the largest untapped opportunities in the modern enterprise.
Agentforce for Supply Chain is reimagining the supply chain with an AI-powered platform for designing, automating, and running end-to-end business processes, with seamless collaboration through familiar channels like email. For Salesforce, this represents a massive growth opportunity in the back office, with innovations that flow into the front office. Customers are clamoring for more, rapidly expanding their use cases as we enter an exhilarating growth phase. As one user put it: “I’ve been waiting for this for 20 years.”
We are seeking a highly experienced and impactful Lead Member of Technical Staff (LMTS) or Principal Member of Technical Staff (PMTS) to serve as a key individual contributor within the Integration & Platform team. This role is crucial for designing, building, and scaling the mission-critical data and integration backbone of our platform. The ideal candidate will possess deep expertise in building robust, scalable data ingestion and export pipelines, integrating custom services with enterprise platforms, and simplifying complex customer APIs.
Key focus areas include:
Platform Integration: Architecting and implementing powerful integrations between our core platform and Salesforce ecosystem products, specifically MuleSoft and Flow.
Data Export & Scale: Leading the design and implementation of highly scalable, efficient, and reliable data export pipelines, including integrating our internal data sources (CSV files, REST API outputs) with Salesforce Data Cloud and/or Informatica.
ETL Automation & Reliability: Optimizing our existing ETL pipeline infrastructure to support a daily, self-serve customer data synchronization mechanism with Data Cloud. This solution must be fully automated, requiring minimal engineering involvement after initial setup, and must demonstrate exceptional robustness, scalability, reliability, and performance.
Customer API Abstraction: Designing and building simplified, performant, customer-facing APIs and webhooks. This includes creating an abstracted version of high-value functionality currently exposed via complex GraphQL API endpoints, focusing on a minimal interface for core customer workflows (see the illustrative sketch after this list).
Distributed Systems Expertise: Deep understanding and practical experience with event queues, asynchronous task processing, and designing highly available, geographically distributed systems.
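
To give a concrete feel for the API abstraction work described above, here is a minimal sketch of wrapping a complex GraphQL query behind a simplified REST endpoint. Everything in it is hypothetical: the endpoint URL, the ShipmentStatus query, the field names, and the choice of Flask and requests are assumptions for illustration only; this posting does not specify the platform's actual stack, schema, or authentication model.

# Illustrative sketch only. All names (endpoint, query, fields) are
# hypothetical and do not reflect the platform's real API surface.
import requests
from flask import Flask, jsonify

app = Flask(__name__)

GRAPHQL_URL = "https://example.invalid/graphql"  # hypothetical internal endpoint

# A complex internal GraphQL query, reduced to the one field set a
# core customer workflow actually needs.
SHIPMENT_STATUS_QUERY = """
query ShipmentStatus($id: ID!) {
  shipment(id: $id) {
    id
    status
    eta
    lastCheckpoint { location timestamp }
  }
}
"""

@app.get("/v1/shipments/<shipment_id>/status")
def shipment_status(shipment_id: str):
    # Issue the GraphQL request on the customer's behalf.
    resp = requests.post(
        GRAPHQL_URL,
        json={"query": SHIPMENT_STATUS_QUERY, "variables": {"id": shipment_id}},
        timeout=10,
    )
    resp.raise_for_status()
    shipment = resp.json()["data"]["shipment"]
    # Flatten the nested GraphQL shape into a minimal, stable REST payload.
    return jsonify({
        "id": shipment["id"],
        "status": shipment["status"],
        "eta": shipment["eta"],
        "last_seen_at": shipment["lastCheckpoint"]["timestamp"],
        "last_seen_location": shipment["lastCheckpoint"]["location"],
    })

if __name__ == "__main__":
    app.run()  # local development server only

The design point is the contract: customers get one stable, versioned resource per core workflow rather than the full GraphQL schema surface.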
Responsibilities
Day-to-Day & Project Execution
Architect, design, and deliver high-quality, scalable code for complex data integration and platform features.
Drive technical decision-making and project execution as the primary individual contributor for integration initiatives.
Conduct thorough code reviews and mentor other engineers on best practices for performance, security, and scalability.
Collaborate cross-functionally with Product Management and other engineering teams to define requirements and deliver solutions that meet business needs.
Long-Term Goals & Strategic Impact
Set the technical direction and standards for all data integration and ETL processes within the platform.
Identify and mitigate architectural risks associated with scaling our data infrastructure to support exponential customer growth.
Drive continuous improvement in system performance, observability, and operational efficiency.
Experience Gained & Impact Made
Experience Gained: The candidate will gain unparalleled experience leading the transformation of mission-critical data systems at enterprise scale, directly influencing the architecture of Salesforce's next-generation integration platform utilizing tools like MuleSoft and Data Cloud.
Impact Made: This role is critical to enabling customer success by ensuring a reliable, performant, and autonomous data sync experience. The work directly forms the foundational data layer powering core customer intelligence and platform functionality.
Required Qualifications
8+ years of professional software development experience, with a significant focus on large-scale data systems, ETL, or integration platforms
A related technical degree is required
Demonstrated expertise in designing, building, and maintaining production-grade data pipelines (ETL/ELT)
Expert-level proficiency in at least one modern programming language (e.g., Java, Python, Go) suitable for platform development
Proven experience designing and implementing robust, external-facing REST APIs and webhooks
Deep practical knowledge of building and operating distributed systems, concurrency, and high-availability architectures
Experience in a lead technical role (LMTS/PMTS level or equivalent), driving and owning complex projects as an individual contributor
Excellent written and verbal communication skills
Preferred Qualifications
Hands-on experience with Salesforce integration technologies such as MuleSoft or Salesforce Data Cloud
Experience working with large-scale event streaming platforms (e.g., Kafka, Kinesis)
Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes)
Experience designing or working with GraphQL API implementations
Advanced degree in Computer Science or a related technical field
Published work, such as contributions to open-source projects, patents, or papers
Exposure to the supply chain, logistics, or manufacturing industries
