Google Data Engineer

Shift: 11 AM – 8 PM IST

Location: Remote

No. of Positions: 1

Job Type: 3+ month contract; freelancers may be considered if they can commit to the full 8-hour shift.

Job Overview:

We are looking for a Google Data Engineer to join our dynamic team. In this role, you will be responsible for designing, building, and maintaining scalable data pipelines and processing systems on Google Cloud Platform (GCP). You will work with large datasets, optimize data workflows, and implement best practices for data storage, security, and real-time analytics.

Key Responsibilities:

  • Work with large datasets to solve complex analytical problems.
  • Conduct end-to-end data analyses, including collection, processing, and visualization.
  • Develop and optimize Google Cloud data structures and workflows.
  • Identify patterns and trends in data and develop algorithms to enhance data usability.
  • Design, build, operate, and secure data processing systems on Google Cloud Platform (GCP).
  • Maintain ETL pipelines that handle both structured and unstructured data sources.
  • Optimize SQL queries and improve database performance.
  • Work with Hadoop and Kafka to handle large-scale data processing.
  • Automate manual processes and enhance system efficiency.
  • Design and develop data warehouses and distributed systems.
  • Ensure data security, governance, and compliance in all data operations.
  • Collaborate with cross-functional teams, including data scientists, software engineers, and business stakeholders, to develop data-driven solutions.

Required Technical Skills:

  • Strong expertise in SQL databases with the ability to write and optimize complex queries.
  • Experience in data warehousing, data modelling, and ETL pipelines.
  • Understanding of UNIX/Linux environments.
  • Hands-on experience with Kafka for real-time data streaming.
  • Knowledge of data structures and algorithms to optimize data processing.
  • Familiarity with Google Cloud Platform (BigQuery, Dataform, Dataflow, Pub/Sub, Cloud Storage, etc.).
  • Experience in machine learning model deployment is a plus.

Preferred Qualifications:

  • Google Professional Data Engineer Certification.
  • Bachelor's degree in Computer Applications.
  • Experience with distributed computing frameworks such as Hadoop and Spark.
  • Knowledge of CI/CD pipelines for data engineering workflows.
  • Strong analytical and problem-solving skills.

    Apply / Refer to Vertex

    Vertex Computer Systems is Hiring! Join the Team »