Full description
We are Expert Allies, an employee-oriented company specializing in outsourcing and providing our clients with the best specialists in the software industry. We invest in our colleagues: we support their needs, provide education, and help them grow and develop their skills.
For one of our clients — a global leader in providing fundamental analysis, data, and advisory services across the energy markets, including oil, natural gas, LNG, NGLs, and emerging energy markets — we are looking for an AI Engineer.
This role involves working with cutting-edge technologies such as Python, AWS, and Amazon Bedrock to develop high-quality web scraping solutions. Given that Bedrock is a relatively new technology, all specialists will undergo additional training to familiarize themselves with its specific use cases.
Your key responsibilities will involve:
- Developing and maintaining scalable web scraping solutions to extract, process, and manage large volumes of data.
- Building and optimizing data pipelines using Python and AWS services.
- Collaborating with AI Engineers and Data Scientists to enhance automation and data collection strategies.
- Working with Amazon Bedrock to integrate AI-driven insights into scraping processes (training provided).
- Ensuring the reliability, efficiency, and scalability of web scraping tools and techniques.
- Staying up to date with the latest advancements in web scraping, AI, and cloud computing.
Requirements:
- Strong proficiency in Python for web scraping and automation.
- Experience with AWS services (Lambda, S3, EC2, etc.).
- Familiarity with Amazon Bedrock or willingness to learn.
- Hands-on experience with web scraping frameworks such as Scrapy, BeautifulSoup, or Selenium.
- Knowledge of data processing, storage, and APIs.
- Understanding of scalability, security, and ethical considerations in web scraping.