Leading Edge Skills is a focused IT and business consulting and training company based in California, in the heart of Silicon Valley, that prepares students to transition into and launch a new career quickly. We take pride in offering a unique learning experience through training programs that are affordable and aligned with industry requirements.
Our professionals are our most valuable and respected asset. We make sure they use their full potential not only to transform our clients' skill sets but also to prepare clients for the unexpected and challenging scenarios they may face in the workplace.
We’re looking for people with passion, empathy, and accountability to prepare content and train students for our training programs, focused on roles such as:
EDI Solutions Engineer (Techno-Functional Lead)
Position summary (purpose)
Own the end-to-end health of EDI workflows—design, build, configure, and support data intake, transformation, validation, and delivery. Bridge business/operations and engineering to ensure EDI processes meet performance, accuracy, security, and availability targets.
Core responsibilities:
A. Solution Design & Implementation
- Translate program requirements, data dictionaries, and validation rules into technical designs and acceptance criteria.
- Design reusable transformation pipelines (e.g., CSV/XML/JSON/X12 → canonical → system-specific formats).
- Configure/extend pre- and post-processing utilities (e.g., parsing, cleansing, enrichment, QC hooks, audit logging).
- Build APIs, jobs, and scripts for file ingest, routing, and acknowledgments (batch and near-real-time).
- Establish error-handling and reprocessing patterns (idempotency, poison-queue handling, replay workflows); see the sketch after this list.
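To make the transformation-pipeline and idempotency items above concrete, here is a minimal Python sketch of a reusable CSV-to-canonical ingest step that skips files it has already processed. Everything in it (field names, file layout, the local processed-files log) is a hypothetical illustration, not part of any specific LES or client system.

```python
# Hypothetical sketch: CSV -> canonical records, with idempotent reprocessing
# keyed on a file hash. All names are illustrative assumptions.
import csv
import hashlib
import json
from pathlib import Path

PROCESSED_LOG = Path("processed_files.json")  # assumed local state; a real system would use a database

def file_fingerprint(path: Path) -> str:
    """Hash the file contents so a replayed file is recognized and skipped (idempotency)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def to_canonical(row: dict) -> dict:
    """Map one source CSV row to a canonical record; the field names are made up."""
    return {
        "member_id": row["MemberID"].strip(),
        "service_date": row["SvcDate"],
        "amount": round(float(row["Amount"]), 2),
    }

def ingest(path: Path) -> list[dict]:
    """Parse a CSV file into canonical records, skipping files already processed."""
    seen = json.loads(PROCESSED_LOG.read_text()) if PROCESSED_LOG.exists() else []
    digest = file_fingerprint(path)
    if digest in seen:
        return []  # replaying the same file is a no-op
    with path.open(newline="") as fh:
        records = [to_canonical(row) for row in csv.DictReader(fh)]
    seen.append(digest)
    PROCESSED_LOG.write_text(json.dumps(seen))
    return records
```

In practice the "already processed" state would live in a database or the integration platform's own tracking rather than a local JSON file; the point of the sketch is the replay-safe pattern, not the storage choice.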
B. Development & Configuration Management
- Develop enhancements and fixes (e.g., .NET/C#, Python, or Java; SQL/PL-SQL) with unit/integration tests (see the test sketch after this list).
- Version control, branching strategy, and CI/CD (e.g., GitHub/GitLab, pipelines, artifact repos).
- Maintain technical documentation: interface specs, mapping guides, sequence diagrams, runbooks.
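As a small illustration of the unit-test expectation above, the sketch below tests a hypothetical mapping helper; the to_canonical function and its field names are stand-ins invented for the example, not existing LES utilities.

```python
# Hypothetical sketch: a unit test for a mapping helper of the kind described above.
import unittest

def to_canonical(row: dict) -> dict:
    """Stand-in mapping helper: trims IDs and normalizes amounts to two decimals."""
    return {"member_id": row["MemberID"].strip(), "amount": round(float(row["Amount"]), 2)}

class ToCanonicalTest(unittest.TestCase):
    def test_trims_id_and_parses_amount(self):
        rec = to_canonical({"MemberID": " A1 ", "Amount": "10.50"})
        self.assertEqual(rec["member_id"], "A1")
        self.assertEqual(rec["amount"], 10.5)

    def test_missing_field_raises(self):
        with self.assertRaises(KeyError):
            to_canonical({"Amount": "1.00"})

if __name__ == "__main__":
    unittest.main()
```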
C. Operations & Troubleshooting
- Serve as Tier-2/Tier-3 for EDI incidents; triage, root-cause, and resolve within SLAs.
- Monitor pipelines (metrics, logs, alerts); tune performance and throughput; eliminate recurring defects.
- Execute release management (sandbox → test → prod), change records, and rollback plans.
D. Data Quality, Security, and Compliance
- Implement validation rules, referential checks, and balancing controls; design QC dashboards (see the sketch after this list).
- Enforce data handling standards (least privilege, encryption at rest/in transit, secure transfer).
- Support accessibility/records/retention requirements and audit requests with traceable logs.
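For illustration, a minimal Python sketch of a record-level validation rule and a batch balancing control of the kind described above; the field names, rules, and control totals are assumptions made up for the example.

```python
# Hypothetical sketch: record validation plus a balancing control that compares
# detail records against a control/trailer count and total.
from decimal import Decimal

def validate_record(rec: dict) -> list[str]:
    """Return the list of validation errors for one canonical record (empty means valid)."""
    errors = []
    if not rec.get("member_id"):
        errors.append("missing member_id")
    if Decimal(str(rec.get("amount", "0"))) < 0:
        errors.append("negative amount")
    return errors

def balance_batch(records: list[dict], control_count: int, control_total: Decimal) -> dict:
    """Balancing control: detail count and amount total must match the control values."""
    detail_total = sum(Decimal(str(r["amount"])) for r in records)
    return {
        "count_ok": len(records) == control_count,
        "total_ok": detail_total == control_total,
        "detail_count": len(records),
        "detail_total": str(detail_total),
    }

# Two-record batch balanced against its control totals
batch = [{"member_id": "A1", "amount": "10.00"}, {"member_id": "A2", "amount": "5.50"}]
print([validate_record(r) for r in batch])                                    # [[], []]
print(balance_batch(batch, control_count=2, control_total=Decimal("15.50")))  # both checks pass
```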
E. Stakeholder Enablement
- Partner with analysts/SMEs to interpret anomalies and update business rules.
- Provide technical coaching to analysts/supervisors; create job aids and quick-start guides.
- Contribute to monthly performance reports with metrics and corrective-action status.
Required qualifications
- Education/Experience: Bachelor’s in CS/IS/Engineering (or equivalent experience); 5–8+ years in EDI, data integration, or enterprise systems with demonstrated techno-functional ownership.
- Languages/Platforms: Proficiency in .NET/C# (or Java) and SQL (Oracle/PostgreSQL/MS SQL). Scripting in Python or PowerShell for ops tooling.
- EDI/Integration: Hands-on with file and message standards (X12/EDIFACT, CSV, XML, JSON), mapping/transformation, schema validation (XSD/JSON Schema), and API patterns (REST/SOAP).
- Pipelines/Automation: ETL/ELT tools, schedulers, message brokers/queues, CI/CD, and containerization basics.
- Ops/Monitoring: Log aggregation (e.g., ELK/CloudWatch), alerting, and APM; comfort with Linux/Windows admin tasks.
- Methods: Requirements grooming, user stories, acceptance criteria, test-driven fixes, change control, and incident/problem management (ITIL concepts).
- Soft skills: Clear communicator with non-technical stakeholders; structured problem solver; documentation discipline.
Preferred qualifications
- Experience in public-sector data programs; familiarity with data confidentiality and controlled-access environments.
- Knowledge of accessibility and content standards in training/reporting deliverables.
- Exposure to secure file transfer (SFTP/AS2), PKI/cert management, and data-rights considerations.
- Performance tuning for large batch workloads (parallelization, partitioning, bulk operations).
Success metrics (KPIs)
- Timeliness: % incidents resolved within SLA; mean time to restore (MTTR).
- Quality: Defect escape rate to production; % successful first-pass loads; validation error rate.
- Performance: Throughput vs. target (files/hour or records/minute); average job duration.
- Stability: Failed-run rate; auto-recovery success rate; change failure rate after releases.
- Maintainability: Test coverage for mappings/utilities; documentation completeness (runbooks, mappings).
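As a worked example of two of these KPIs, the sketch below computes MTTR and the first-pass load percentage from toy data; the record shapes and values are assumptions for illustration only.

```python
# Hypothetical sketch: computing MTTR and first-pass load rate from toy data.
from datetime import datetime

# Toy incident records: (opened, restored)
incidents = [
    (datetime(2024, 1, 2, 9, 0), datetime(2024, 1, 2, 11, 30)),
    (datetime(2024, 1, 5, 14, 0), datetime(2024, 1, 5, 15, 0)),
]
# Toy load results
loads = [{"file": "claims_01.x12", "first_pass": True},
         {"file": "claims_02.x12", "first_pass": False},
         {"file": "claims_03.x12", "first_pass": True}]

# Mean time to restore, in hours
mttr_hours = sum((end - start).total_seconds() for start, end in incidents) / len(incidents) / 3600
# Percentage of loads that succeeded on the first pass
first_pass_rate = 100 * sum(l["first_pass"] for l in loads) / len(loads)

print(f"MTTR: {mttr_hours:.2f} h")                   # MTTR: 1.75 h
print(f"First-pass loads: {first_pass_rate:.1f}%")   # First-pass loads: 66.7%
```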
Attitude required:
- Honesty, humility, and integrity
- Inclusive and respectful
- Strong work ethic
- Passion for learning
- Comfortable with autonomy
- Eager to add new skills and grow professionally
Skills that are preferred, but not required:
- Curiosity
- Assertiveness
- A love of IT and/or technology
- Proficiency with Gmail, Google Docs, Slack, and internal LES tools.
Job Types: Part-time, Contract
Pay: $80.00 - $140.00 per hour
Work Location: Remote