Job Description: Data Engineer (P3/P4)
About the Company
Are you curious, motivated, and forward-thinking? At FIS you’ll have the opportunity to work on some of the most challenging and relevant issues in financial services and technology. FIS is a leading fintech provider with a wide range of products and services. Our talented people empower us, and we believe in being part of a team that is open, collaborative, entrepreneurial, passionate and, above all, fun.
About the role
We are seeking a Data Engineer with a strong foundation in big data technologies such as Hive and Snowflake, along with experience in AWS services and data ingestion via APIs.
This individual will be instrumental in building scalable data solutions, primarily for AI and machine learning initiatives, leveraging AWS and Snowflake technologies to support our data infrastructure needs. Experience in the fintech capital markets space is an added advantage.
Experience: 5-8 years in AWS Data Engineering
What you will be doing
- Implement data lake and warehousing strategies to support analytics, AI and machine learning initiatives.
- Develop and maintain scalable and reliable data pipelines to ingest data from various APIs into the AWS ecosystem.
- Manage data storage solutions using S3 buckets, ensuring best practices in data organization and security.
- Perform hands-on data warehousing tasks, optimizing data retrieval and query performance.
- Utilize EC2 instances for custom applications and services that require compute capacity.
- Collaborate with cross-functional teams to understand data needs and deliver solutions that align with business goals.
- Ensure compliance with data governance and security policies.
- Define strategies to leverage existing large datasets and integrate new datasets to extend product capabilities, working closely with the data science and product engineering teams on the development of data products.
- Identify relevant data sources and datasets to mine for client business needs, and collect large structured and unstructured datasets and variables.
- Mine big data stores, perform data and error analysis, and clean and validate data for uniformity and accuracy.
What you bring
- Expertise in Python; experience with Airflow is a plus.
- Proficiency in Snowflake and AWS services.
- Proficiency in data ingestion and integration, particularly with APIs.
- Strong understanding of data warehousing, ETL processes, and cloud data storage.
- Strong experience with scripting languages such as Python for data manipulation and automation.
- Familiarity with infrastructure as code tools for managing AWS resources.
- Excellent problem-solving skills and ability to work in a dynamic environment.
- Strong communication skills for effective collaboration and documentation.
- Expertise in building analytics models, strong problem-solving ability, and a knack for statistical analysis.
- Experience querying databases and using statistical programming languages.
- Advanced knowledge of data analysis, cleaning, and preparation.
- Experience building cloud-based solutions with AWS SageMaker and Snowflake.
- Strong communication skills: listening, comprehension, and speaking.
Privacy Statement
FIS is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how FIS protects personal information online, please see the Online Privacy Notice.
Sourcing Model
Recruitment at FIS works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. FIS does not accept resumes from recruitment agencies that are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, to our employees, or to any other part of our company.
#pridepass