
Data Engineer Job Openings in Bangalore 2026
IBM has announced a job vacancy for the post of Data Engineer – Data Platforms – Google. The place of posting will be Bangalore. Candidates who have completed a Graduate / Engineering / Post Graduate degree, whether freshers or experienced, are eligible to apply. More details about the qualifications, job description, and roles and responsibilities are as follows.
Company Overview
| Field | Details |
| --- | --- |
| Name of the Company | IBM |
| Required Qualifications | Graduate / Post Graduate |
| Skills | Apache Airflow, dbt, Spark/Python, or Spark/Scala |
| Category | Data & Analytics |
| Work Type | Onsite |
In this role, you’ll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. These delivery centers offer clients locally based skills and technical expertise to drive innovation and the adoption of new technology.
Job Details
- Position: Data Engineer – Data Platforms – Google
- Job Location: Bangalore
- Salary: As per company standards
- Job Type: Full Time
- Requisition ID: 85936
Roles and Responsibilities:
As a Data Engineer specializing in Google’s data platforms, you will design, build, and maintain data engineering solutions on Google’s Cloud ecosystem. This role requires expertise in utilizing various Google services for batch and real-time data pipelines, data migration, and data layer design. Your primary responsibilities will include:
- Design Data Pipelines: Design and develop batch and real-time data pipelines for the Data Warehouse and Data Lake using Google services such as Dataproc, Dataflow, Pub/Sub, BigQuery, and Bigtable.
- Develop Data Engineering Solutions: Utilize Google Cloud Storage, Bigtable, BigQuery, Dataproc with Spark and Hadoop, and Dataflow with Apache Beam or Python to build and maintain data engineering solutions.
- Manage Data Platforms: Schedule and manage the data platform using Google Cloud Scheduler and Cloud Composer (Airflow), ensuring efficient data pipeline operations.
- Implement Data Migration: Develop and implement data migration solutions using Google services, ensuring seamless data transfer between systems.
- Optimize Data Layer: Design and optimize the data layer using Google services such as BigQuery, Bigtable, and Cloud Spanner, ensuring efficient data storage and retrieval.
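To make the pipeline work above concrete, here is a minimal sketch (not IBM's code): a hypothetical per-element transform of the kind that runs inside a Dataflow (Apache Beam) job, with the pipeline wiring shown in comments. The event schema, bucket, and table names are invented for illustration.

```python
import json
from datetime import datetime, timezone

def parse_event(line: str) -> dict:
    """Parse one raw JSON event into a flat record suitable for BigQuery.

    This is the kind of per-element function that would run inside a
    Beam ``Map`` or ``DoFn`` step on Dataflow.
    """
    raw = json.loads(line)
    return {
        "user_id": str(raw["user_id"]),
        "event_type": raw.get("event_type", "unknown"),
        # Normalize epoch seconds to an ISO-8601 UTC timestamp.
        "event_ts": datetime.fromtimestamp(
            raw["ts"], tz=timezone.utc
        ).isoformat(),
    }

# With Apache Beam installed, the same function plugs into a batch pipeline:
#
#   import apache_beam as beam
#   with beam.Pipeline() as p:
#       (p
#        | beam.io.ReadFromText("gs://example-bucket/events/*.json")
#        | beam.Map(parse_event)
#        | beam.io.WriteToBigQuery("example_dataset.events"))
```

Swapping `ReadFromText` for a `ReadFromPubSub` source is what turns the same transform into a real-time pipeline.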
Required Skills & Qualifications:
- Required education: Bachelor’s Degree
- Preferred education: Master’s Degree
- Google Cloud Ecosystem Expertise: Exposure to designing, building, and maintaining data engineering solutions on the Google Cloud ecosystem, including services such as Dataproc, Dataflow, Pub/Sub, BigQuery, Bigtable, Cloud Spanner, Cloud SQL, and AlloyDB.
- Data Pipeline Development Experience: Exposure to developing and managing batch and real-time data pipelines for the Data Warehouse and Data Lake using Google services and open-source technologies such as Apache Airflow, dbt, Spark/Python, or Spark/Scala.
- Google Cloud Services Proficiency: Experience working with Google Cloud Storage, Bigtable, BigQuery, Dataproc with Spark and Hadoop, and Dataflow with Apache Beam or Python to build and maintain data engineering solutions.
- Data Platform Management Knowledge: Exposure to scheduling and managing the data platform using Google Cloud Scheduler and Cloud Composer (Airflow) for efficient data pipeline operations.
- Data Layer Design Understanding: Experience working with data layer design using Google services such as BigQuery, Bigtable, and Cloud Spanner for efficient data storage and retrieval.
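The "data layer design" skill above usually comes down to choosing partitioning and clustering for warehouse tables. The snippet below is a hypothetical illustration, not part of the posting: an invented BigQuery DDL statement held in a Python string, with the client call that would run it shown in comments.

```python
# Hypothetical DDL for a BigQuery events table — dataset, table, and
# column names are invented for illustration.
EVENTS_DDL = """
CREATE TABLE IF NOT EXISTS example_dataset.events (
  user_id    STRING NOT NULL,
  event_type STRING,
  event_ts   TIMESTAMP NOT NULL
)
PARTITION BY DATE(event_ts)      -- prune scans to the dates actually queried
CLUSTER BY user_id, event_type   -- co-locate rows for common filter columns
"""

# With the google-cloud-bigquery library installed and credentials set up,
# the DDL runs as an ordinary query job:
#
#   from google.cloud import bigquery
#   bigquery.Client().query(EVENTS_DDL).result()
```

Partitioning by event date keeps daily batch loads and time-bounded queries from scanning the whole table, which is the main cost and latency lever in BigQuery.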
Preferred Technical and Professional Experience:
- Open-Source Technologies: Exposure to utilizing open-source technologies like Apache Airflow, dbt, Spark/Python, or Spark/Scala for developing and managing batch and real-time data pipelines.
- Data Migration Solutions: Experience working with Google services to develop and implement data migration solutions, ensuring seamless data transfer between systems.
- Cloud Composer Expertise: Exposure to using Cloud Composer (Airflow) for scheduling and managing the data platform, ensuring efficient data pipeline operations.
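For the Cloud Composer point above, a schedule is ultimately just a DAG file dropped into the Composer environment's `dags/` folder. The sketch below is a hypothetical example (DAG id, cron schedule, and task names are invented); the Airflow-specific parts are shown in comments since they require Airflow itself to run.

```python
# Hypothetical Cloud Composer (Airflow) schedule for a daily load job.
DAG_ID = "daily_events_load"
SCHEDULE = "0 2 * * *"  # 02:00 UTC daily; Cloud Scheduler uses the same cron syntax

# Inside a Composer environment (where apache-airflow is preinstalled),
# the DAG definition would look like:
#
#   import pendulum
#   from airflow.decorators import dag, task
#
#   @dag(dag_id=DAG_ID, schedule=SCHEDULE,
#        start_date=pendulum.datetime(2026, 1, 1, tz="UTC"), catchup=False)
#   def daily_events_load():
#       @task
#       def run_load():
#           ...  # e.g. trigger a Dataflow job or a BigQuery load
#       run_load()
#
#   daily_events_load()
```

`catchup=False` keeps Composer from backfilling every missed interval since the start date, which is usually the right default for operational loads.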
How to Apply
Apply Link – Click Here
For Regular Updates Join our WhatsApp – Click Here
For Regular Updates Join our Telegram – Click Here
Disclaimer:
The information provided on this page is intended solely for informational purposes for Students, Freshers, and Experienced candidates. All recruitment details are sourced directly from the official website and pages of the respective company. Latest MNC Jobs does not guarantee job placement, and the recruitment process will follow the company’s official rules and Human Resource guidelines. Latest MNC Jobs does not charge any fees for sharing job information, and strongly advises candidates not to make any payments for any job opportunity.