Location: Remote — Argentina, Colombia, México, Chile, Uruguay
Job Type: Full-time contract
Date Posted: March 2026
Project Brief
We're building a cloud-native data infrastructure powering AI-driven products — from ingestion pipelines and vector search to MCP-based integrations and multi-tenant architectures. You'll own the design, deployment, and reliability of data pipelines end-to-end, working across AWS, Terraform, and Pinecone in a fast-moving environment where architectural decisions matter and ownership is real.
Must-Haves
- Strong Python skills — Lambda functions, scripting, API consumption, and data processing pipelines
- Hands-on AWS experience across: Lambda, S3, API Gateway, ECS/Fargate, Secrets Manager, Cognito, CloudWatch, and SQS + DLQ
- Terraform for infrastructure-as-code — remote state with S3 + DynamoDB, modular resource definitions, and multi-environment management
- Docker — containerization, Dockerfile hardening, and ECS task image builds
- Pinecone — index creation, namespace design for multi-tenancy, metadata schema, upsert logic, and hybrid search configuration
- Experience with embedding models and ETL/data transformation pipelines
- MCP protocol experience — FastMCP SDK, Streamable HTTP transport, tool definition
- REST API consumption — Cognito-authenticated requests, response parsing, and S3 persistence
- Git/GitHub — trunk-based branching, semantic PRs, and branch-per-feature workflows
- JSON schema design for pipeline service contracts and API response structures
- Retry and error handling patterns — exponential backoff and failure architecture
- Clear English communication and ability to work independently with minimal supervision
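The retry and error-handling bullet above refers to a standard pattern: exponential backoff with jitter, surfacing the error (e.g. to a DLQ) once retries are exhausted. A minimal Python sketch — the function name and parameters are illustrative, not part of any actual codebase for this role:

```python
import random
import time


def retry_with_backoff(fn, max_attempts=5, base_delay=0.5, max_delay=30.0,
                       retryable=(ConnectionError, TimeoutError)):
    """Call fn(), retrying transient failures with exponential backoff + jitter."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except retryable:
            if attempt == max_attempts:
                raise  # retries exhausted: surface to caller (or route to a DLQ)
            # Exponential backoff: 0.5s, 1s, 2s, ... capped, with full jitter.
            delay = min(max_delay, base_delay * 2 ** (attempt - 1))
            time.sleep(random.uniform(0, delay))
```

The jitter matters in practice: without it, many failed Lambda invocations retry in lockstep and re-overload the downstream service.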
Nice to Have
- Incremental update and change detection logic for pipeline efficiency
- CloudWatch alarm design and pipeline health dashboards
- Content-hash deduplication for embedding cost optimization
- Lambda memory and concurrency tuning
- VPN site-to-site configuration
- LangChain / LangGraph experience
- Prompt engineering fundamentals
- GitHub Actions — CI/CD pipeline setup and deployment automation
- Marketing business acumen
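Content-hash deduplication (listed above as an embedding cost optimization) typically means fingerprinting each chunk's text and skipping the embedding call when the fingerprint is unchanged. A minimal sketch with hypothetical names — the hash store would in practice live in DynamoDB or S3 alongside the vectors:

```python
import hashlib


def content_hash(text: str) -> str:
    """Stable fingerprint of a chunk's content."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()


def chunks_to_embed(chunks, seen_hashes):
    """Return only chunks whose content changed since the last run.

    `chunks` is an iterable of (chunk_id, text) pairs; `seen_hashes`
    maps chunk id -> previously stored hash. Unchanged chunks skip the
    embedding call entirely, which is where the cost savings come from.
    """
    todo = []
    for chunk_id, text in chunks:
        h = content_hash(text)
        if seen_hashes.get(chunk_id) != h:
            seen_hashes[chunk_id] = h
            todo.append((chunk_id, text))
    return todo
```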
What You Will Do
- Design, build, and maintain cloud-native data pipelines across AWS Lambda, S3, ECS/Fargate, and SQS
- Manage and evolve infrastructure-as-code using Terraform across multiple environments
- Build and optimize vector search workflows in Pinecone — including multi-tenant namespace design, metadata schemas, and hybrid search
- Develop and improve MCP server integrations using FastMCP SDK and Streamable HTTP transport
- Implement and harden authentication flows using AWS Cognito and Secrets Manager
- Monitor pipeline health via CloudWatch — log groups, alarms, and observability for Lambda and ECS
- Apply cost modeling discipline — flag and mitigate high-cost operations (e.g. embedding calls at scale)
- Work within an established SA governance model, contributing to architectural decisions with clear documentation and standards
- Collaborate with engineering and applied science teams to deliver reliable, scalable data infrastructure
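The "JSON schema design for pipeline service contracts" requirement usually amounts to a versioned message envelope that one pipeline stage publishes (e.g. to SQS) and the next consumes. A hedged sketch — the event name and fields are invented for illustration, not an actual contract from this project:

```python
import json
from dataclasses import asdict, dataclass


@dataclass
class IngestEvent:
    """Hypothetical service contract for a document-ingestion message."""
    tenant_id: str
    source_uri: str
    content_hash: str
    schema_version: int = 1

    def to_json(self) -> str:
        return json.dumps(asdict(self))

    @classmethod
    def from_json(cls, payload: str) -> "IngestEvent":
        data = json.loads(payload)
        # Validate required fields up front so malformed messages fail
        # loudly at the consumer boundary instead of deep in the pipeline.
        missing = {"tenant_id", "source_uri", "content_hash"} - data.keys()
        if missing:
            raise ValueError(f"invalid IngestEvent, missing: {sorted(missing)}")
        return cls(**data)
```

Carrying an explicit `schema_version` is what lets producers and consumers evolve the contract independently across environments.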
Why This Could Be Your Next Big Move
- 🧱 Own the data foundation — You're not maintaining legacy systems. You're designing the infrastructure that AI products are built on top of.
- 🤖 Work at the AI/data intersection — Vector search, embeddings, MCP integrations, and multi-tenant architectures. This is where data engineering and AI meet.
- ☁️ Deep AWS stack — Lambda, ECS, Cognito, SQS, Secrets Manager and more — a real opportunity to broaden and deepen your cloud expertise across a full production stack.
- 🔝 High ownership, high impact — Fast-moving environment where your architectural decisions shape the product directly, with visibility across the entire engineering org.
Benefits & Compensation
- 💵 $3,500 – $5,000/month — paid in USD, bi-weekly via Deel
- 🌎 Fully Remote — work from anywhere in Latin America
- 📄 Full-time contract with a U.S. company — 6-month initial term, full-time conversion
- 🏖️ Paid PTO — competitive package, grows with tenure
- 🤝 Referral Program — earn a bonus for referring talent that gets hired
To Apply
Please send your resume in English.
Include the following with your application:
- 🔗 LinkedIn Profile URL (required)
- 💻 GitHub Portfolio or code samples — show us your pipeline and infrastructure work (required)
- ✉️ Cover Letter — tell us about a data pipeline or cloud architecture you built and the impact it had (optional but encouraged)
About OneSeven Tech
Founded by James Sullivan, OneSeven Tech is a premier digital product agency serving startups and enterprises. Our clients have collectively raised over $100M in VC, and our enterprise partners include 2,000+ person hospitality groups and NASDAQ-listed companies.
We work fully remote, move fast with 1-week sprints, and focus on elegant solutions to complex problems.