Position Type: Contract, Full-Time, Remote
Working Hours: CST (Central Standard Time)
About Pavago:
Pavago is hiring a Database Engineer to support an innovative project involving the integration of advanced algorithms and dynamic user interactions for our client. The project emphasizes the analysis and visualization of large-scale engineering datasets, requiring cutting-edge expertise in database architecture and optimization. If you thrive in tackling complex data challenges and are passionate about enhancing efficiency and scalability, we’d love to hear from you!
Key Responsibilities:
- Database Architecture Design: Create, optimize, and enhance database systems to efficiently manage and process large-scale data operations.
- Data Pipeline Development: Implement robust pipelines for seamless data exchange between algorithms and the database.
- Performance Optimization: Enhance database performance by optimizing SQL queries, indexing, and architecture.
- Real-Time Interaction Enablement: Collaborate with the development team to ensure real-time interactivity between users and graph data.
- Algorithm Refinement: Assist in refining proprietary algorithms to improve statistical and economic insights.
- Future ML Integration: Contribute to the integration of machine learning frameworks for enhanced data processing and analysis.
Requirements:
- Technical Proficiency: Strong knowledge of SQL, C#, and Python for database optimization and algorithm integration; familiarity with big data technologies, cloud platforms such as Azure, NoSQL databases such as MongoDB, and machine learning frameworks like TensorFlow or PyTorch.
- Skills: Exceptional analytical and problem-solving abilities, strong communication skills, proficiency with project management and collaboration tools (Asana, Azure Boards, Slack, or Jira), and the ability to work both independently and collaboratively within a team.
- Experience: 5-6 years of proven experience in database design and optimization, developing data pipelines for large-scale operations, and working in fast-paced environments with cross-functional teams.
What Makes You a Perfect Candidate?
- SQL Expertise: You have strong expertise in SQL and database optimization.
- Programming Proficiency: You are proficient in C# and Python for database integration.
- Technological Familiarity: You are experienced with cloud platforms, big data technologies, and AI tools.
- Problem-Solving Skills: You excel at analytical thinking and problem-solving.
- Team Collaboration: You thrive in both collaborative and independent work environments.
What Does a Typical Day Look Like?
Your day will involve designing and optimizing database systems to handle massive datasets, implementing efficient data pipelines, and collaborating with the development team to support real-time user interactions. You will analyze and refine algorithms, ensuring high performance and scalability, while contributing to the integration of machine learning models for advanced data processing.
Interview Process:
- Initial Phone Call: Discuss your experience and suitability for the role.
- Technical Test: Complete a technical assessment of your database and programming proficiency.
- Zoom Call Interview: Explore your technical expertise and professional background in a 30-minute session.
- Final Interview: Meet with our client to confirm alignment with project needs.
- Background Checks: Verify references and past employment.
Ready to Apply?
If this sounds like the perfect opportunity for you, we’d love to hear from you! Submit your application today and take the first step in joining a cutting-edge project at the intersection of data optimization and innovation.