

Responsibilities:
Build and maintain data/solution pipelines that ingest data and match new requirements, using AI support to accelerate data onboarding and quality checks.
Work within a technical team to develop, deploy, and optimize Micron's methods and systems for extracting new insights from expanding data streams, driving AI-enabled analytics at scale.
Develop, automate, and orchestrate an ecosystem of ETL processes for varying volumes of data.
Design and optimize data structures in data management systems (Snowflake, Google Cloud Platform) to enable Smart Manufacturing AI solutions, supporting AI-driven workloads and performance.
Determine transformation requirements and develop processes to bring structured and unstructured data from the source to a new physical data model.
Build custom software components and analytics applications, with AI support to enhance intelligence and automation.
Create/Maintain CI/CD pipelines of data engineering solutions in the cloud, driving AI-informed testing, monitoring, and release automation.
Use AI to complete work more efficiently, empowering AI applications and teams to achieve faster, smarter outcomes.
Qualifications & Skills:
Experience developing, delivering, and/or supporting data engineering, advanced analytics or business intelligence solutions.
Ability to work with multiple operating systems (e.g., Windows, Unix, Linux)
Experience developing ETL/ELT processes using Apache NiFi and Snowflake, GCP BigQuery, or an equivalent data warehouse.
Significant experience with big data processing and/or developing applications and data sources via Hadoop, YARN, Hive, Pig, Sqoop, MapReduce, HBase, Flume, etc.
Understanding of how distributed systems work.
Familiarity with software architecture (data structures, data schemas, etc.)
Strong working knowledge of databases (Oracle, MSSQL, etc.) including SQL and NoSQL.
Strong mathematics background, analytical, problem solving, and organizational skills.
Knowledge of building APIs for application integration
Experience with continuous integration/continuous delivery (CI/CD) tools (Jenkins, Git, Docker, Kubernetes)
Outstanding analytical thinking, interpersonal, oral and written communication skills
Ability to prioritize and meet critical project timelines in a fast-paced environment
Self-motivated and team oriented
Education and Experience:
B.S. degree in Computer Science, Software Engineering, Electrical Engineering, Applied Mathematics or related field of study. M.S. degree preferred.
Minimum of 3 years' experience in any of the following: at least one high-level, object-oriented language (e.g., C#, C++, Java, Python, Perl); one or more web programming languages (PHP, MySQL, Python, Perl, JavaScript, ASP); one or more data extraction tools (SSIS, Informatica, Apache NiFi, or equivalent)
Software development skills and the desire to work on cutting edge development in a Cloud environment
Ability to travel as needed
Establishes database management systems, standards, guidelines and quality assurance for database deliverables, such as conceptual design, logical database, capacity planning, external data interface specification, data loading plan, data maintenance plan and security policy.
Documents and communicates database design.
Evaluates and installs database management systems.
Codes complex programs and derives logical processes on technical platforms.
Builds windows, screens and reports.
Assists in the design of user interfaces and business application prototypes.
Participates in quality assurance and develops test application code in a client-server environment.
Provides expertise in devising, negotiating and defending the tables and fields provided in the database.
Adapts business requirements, developed by modeling/development staff and systems engineers, and develops the data, database specifications, and table and element attributes for an application.
At more experienced levels, helps to develop an understanding of the client's original data and storage mechanisms.
Determines appropriateness of data for storage and optimum storage organization.
Determines how tables relate to each other and how fields interact within the tables for a relational model.
Job ID: 145614505