Software Engineer
Infinite Computer Solutions · Campus, IL
Accepting applications
Full-Time · Mid · AI · Python
Posted: 6d ago
Category: Test
Experience: Mid
Country: United States
Job Description
Primary Skills: Oracle SQL, Python, PySpark, Snowflake, ReactJS; Claude (good to have, for building agentic workflows)
Job Title: Full Stack Data Engineer (Python, PySpark, Snowflake, ReactJS)
About The Role
We are seeking a highly skilled and motivated Full Stack Data Engineer to join our growing team. This role combines strong data engineering expertise with frontend capabilities to build scalable data platforms and user-centric applications. You will work across the stack—from data pipelines to interactive dashboards—leveraging modern technologies and cloud-based architectures.
Key Responsibilities
Design, develop, and optimize complex queries and data models using Oracle SQL.
Build scalable data pipelines using Python and PySpark for batch and real-time processing.
Develop and manage data warehousing solutions on Snowflake, ensuring performance, security, and cost optimization.
Create responsive and dynamic user interfaces using ReactJS to visualize data insights.
Collaborate with cross-functional teams including data scientists, analysts, and product managers to deliver end-to-end solutions.
Ensure data quality, integrity, and governance across all pipelines and systems.
Implement best practices for code quality, testing, and CI/CD.
Monitor and troubleshoot data workflows and application performance.
Required Skills & Qualifications
Strong proficiency in Oracle SQL, including query optimization and performance tuning.
Hands-on experience with Python and distributed data processing using PySpark.
Solid experience working with Snowflake (data modeling, SnowSQL, performance tuning).
Frontend development experience using ReactJS, including state management and API integration.
Understanding of ETL/ELT frameworks and data pipeline architecture.
Experience with version control systems such as Git.
Strong problem-solving skills and attention to detail.
Good to Have
Experience with Claude (Anthropic) or similar LLMs for building agentic workflows and intelligent automation.
Familiarity with cloud platforms such as AWS, Azure, or GCP.
Knowledge of containerization tools like Docker and orchestration frameworks like Kubernetes.
Exposure to REST APIs and microservices architecture.
Experience with workflow orchestration tools (e.g., Airflow).
What We Offer
Opportunity to work on cutting-edge data and AI-driven solutions.
Collaborative and innovative work environment.
Competitive compensation and benefits.
Continuous learning and professional development opportunities.
Qualifications
B.E./B.Tech
Years of Experience: 3–5