Job Title: Senior Advisor - Data Engineering
Experience: 8-10 Years
Notice period: Immediate to 15 days
Location: Bangalore
JD:
Requirements:
· 8+ years of experience in Data Engineering/ETL/Dashboarding/Data Warehousing
· Hands-on experience with both structured and unstructured data
· 5+ years of hands-on experience with the following tools:
· Spark, Java, Iceberg, Kafka, ECS, Python
· StreamSets, Informatica, GPSS, TDAM, ANSI SQL, UNIX
· Teradata, Greenplum, Oracle
Responsibilities:
· Experience supporting large-scale data engineering projects
· Leads or participates in the systems software development lifecycle, which includes research, new development, modification, security, correction of errors, reuse, re-engineering, and maintenance of software products and data pipelines
· Develops technical tools and programs to cleanse, organize, and transform data, and to maintain, protect, and update data structures and data integrity on an automated basis
· Designs and develops processes, automations, and systems to capture, manage, store, and utilize structured and unstructured data to generate actionable insights and solutions
· Responsible for the maintenance, improvement, cleaning, and manipulation of data in the data lake
· Attends to data job failures and data issues reported by users
· Proactively analyzes and evaluates data pipeline failures and triages issues appropriately based on SLAs
· Works with engineering teams to review solutions for operational ease and to ensure they are built to standards
· Collaborates with data engineers, data scientists, architects, and business users to support data needs, create actionable insights, and identify data quality issues
· Designs, codes, tests, and debugs software according to standards, policies, and procedures
· Mentors junior team members on technical and functional skills; should be a strong team player with functional knowledge of business processes
Skills:
· Possesses and applies a broad knowledge of application programming processes and procedures to the completion of complex assignments
· Able to analyze diverse and complex problems
· Advanced ability to effectively troubleshoot program errors
· Builds high-reliability, high-quality, high-volume data pipelines
· Sets up batch, micro-batch, and streaming pipelines
· Data ingestion, transformation, and processing, both batch and near-real-time
· Automated tests and tie-outs; self-healing data jobs
· Builds products that sustain themselves with little to no support after rollout
· Ability to communicate complex insights in a precise and actionable manner
· Mindset to think differently; alignment with industry standards; awareness of emerging technologies and industry trends