ROLE: AWS Lead
We are Cognizant Artificial Intelligence
Digital technologies, including analytics and AI, give companies a once-in-a-generation opportunity to perform orders of magnitude better than ever before. However, clients need new business models built on analyzing customers and business operations from every angle to truly understand them.
With the power to apply artificial intelligence and data science to business decisions via enterprise data management solutions, we help leading companies prototype, refine, and validate the most desirable products and delivery models, and bring them to enterprise scale within weeks.
This position is open to any qualified applicant in the United States.
Qualification:
Bachelor's degree in science, engineering, or equivalent
Salary and Other Compensation:
The base annual salary for this position is between $110,000 and $130,000+, depending on experience and other qualifications of the successful candidate. Applications will be accepted until 10/2/2024.
This position is also eligible for Cognizant’s discretionary annual incentive program and stock awards, based on performance and subject to the terms of Cognizant’s applicable plans.
Benefits: Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
- Medical/Dental/Vision/Life Insurance
- Paid holidays plus Paid Time Off
- 401(k) plan and contributions
- Long-term/Short-term Disability
- Paid Parental Leave
- Employee Stock Purchase Plan
Disclaimer: The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.
Job title: AWS Data Engineer (Remote)
Experience: 8-12+ Years
Must have skills: AWS
Responsibilities:
We are seeking an experienced AWS Cloud Engineer to join our team. The ideal candidate will have extensive expertise in AWS DevOps, AWS IAM, Amazon Redshift, Amazon DynamoDB, Python, AWS Glue Studio, AWS Glue ETL, AWS Glue Catalog, Amazon S3, MWAA (Airflow), and Apache Spark.
- Lead the design and implementation of scalable cloud-based architectures using AWS services.
- Oversee the development and deployment of data pipelines utilizing MWAA (Airflow) and AWS Glue ETL.
- Provide expertise in AWS DevOps to streamline CI/CD processes and automate infrastructure provisioning.
- Manage and secure AWS IAM roles and policies to ensure robust access control and compliance.
- Optimize data storage and retrieval processes using Amazon Redshift and Amazon DynamoDB.
- Develop and maintain Python scripts for data processing and automation tasks.
- Utilize AWS Glue Studio to create and manage ETL workflows and data transformations.
- Implement and manage AWS Glue Catalog to maintain a centralized metadata repository.
- Ensure efficient data storage and retrieval from Amazon S3, leveraging best practices for data management.
- Integrate Apache Spark for large-scale data processing and analytics.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Monitor and troubleshoot system performance, ensuring high availability and reliability of cloud services.
- Stay updated with the latest AWS technologies and best practices to continuously improve our cloud infrastructure.
Qualifications
- Demonstrate proficiency in MWAA (Airflow) for orchestrating complex workflows.
- Show extensive experience in AWS DevOps for automating and managing cloud infrastructure.
- Exhibit strong knowledge of AWS IAM for secure access management.
- Have hands-on experience with Amazon Redshift and Amazon DynamoDB for data storage solutions.
- Be skilled in Python for scripting and automation.
- Utilize AWS Glue Studio for creating and managing ETL workflows.
- Implement AWS Glue ETL for efficient data transformation processes.
- Manage AWS Glue Catalog for centralized metadata management.
- Optimize data storage and retrieval from Amazon S3.
- Integrate Apache Spark for large-scale data analytics.
- Collaborate effectively with cross-functional teams.
- Monitor and troubleshoot cloud infrastructure for high availability.
- Stay updated with AWS technologies and best practices.
Work mode: Remote
Work Location: Remote