Job title: Data Engineer – Consultant or Sr. Consultant
At Deloitte, we offer you not just a job, but a career in the highly sought-after risk management field. We are one of the business leaders in the risk market. We work with a vision to make the world more prosperous, trustworthy, and safe. Deloitte’s clients, primarily based outside of India, are large, complex organizations that constantly evolve and innovate to build better products and services. In the process, they encounter various risks and the work we do to help them address these risks is increasingly important to their success—and to the strength of the economy and public security.
By joining us, you will get to work with diverse teams of professionals who design, manage, and implement risk-centric solutions across a variety of domains. In the process, you will gain exposure to the risk-centric challenges faced in today’s world by organizations across a range of industry sectors and become subject matter experts in those areas.
Our Risk and Financial Advisory services professionals help organizations effectively navigate business risks and opportunities—from strategic, reputation, and financial risks to operational, cyber, and regulatory risks—to gain competitive advantage. We apply our experience in ongoing business operations and corporate lifecycle events to help clients become stronger and more resilient. Our market-leading teams help clients embrace complexity to accelerate performance, disrupt through innovation, and lead in their industries. We use cutting-edge technology like AI/ML techniques, analytics, and RPA to solve Deloitte’s clients’ most complex issues. Working in Risk and Financial Advisory at Deloitte US-India offices has the power to redefine your ambitions.
Regulatory & Legal Services
We help organizations in their efforts to achieve regulatory and legal compliance and transform their departments to add greater value to the business. We combine highly specialized skills in discovery and data management, corporate investigations, Foreign Corrupt Practices Act, and anti-fraud with financial acumen and advanced analytics to produce transformative insights.
In today’s global marketplace, organizations can become vulnerable to critical incidents that include international corruption, financial crime, enterprise fraud, cybercrime, and supply chain breakdowns. Utilizing market-leading technology to uncover latent possibilities, our team advises clients on ways to mitigate exposure to these threats and turn business issues into opportunities for growth, resilience, and long-term advantage.
Our team is made up of software engineers, data engineers, and data scientists collaborating to build world-class fraud detection analytics solutions and a case management web application.
Work you’ll do
The Deloitte Managed Services & Products practice launched a new service in the forensics space to detect fraud, waste, and abuse in various industries. The Anti-Fraud Waste and Abuse solution (“AFES”) continues to increase the capacity, quality, and efficiency of Deloitte forensic processing by utilizing state-of-the-art technology and machine learning in a dedicated environment. We have an urgent need for a Data Engineer to join the Deloitte Analytic & Forensic Technology practice, extend our application, and contribute to our cutting-edge data architecture. Here are just some of the things you will do:
- Develop solutions with an Agile Development team.
- Define, produce, test, review, and debug solutions.
- Create component-based features and micro-frontends.
- Database development with Postgres.
- Create comprehensive unit test coverage in all layers.
- Deploy solutions to Docker containers and Kubernetes.
- Help build a team culture of autonomy and ownership.
- Work with a Product Owner to refine stories into functional use cases and identify the work effort as tasks.
- Participate in test case creation and peer reviews prior to coding.
- Review implementation plans of peers prior to their coding.
- Demonstrate feature work at the end of each iteration.
- Work from home when desired with infrequent visits to the office and limited travel for planning sessions.
- Develop our ETL process into a robust, automated, production-quality solution and lead its implementation and delivery.
- Partner with the application engineering team to ensure our data model fits the needs of the solution, promoting design best practices from both a maintenance and a performance perspective.
- Partner with the data science team to understand their needs for preparing large datasets for machine learning.
- Help the team understand the execution plans of poorly written queries, and remediate performance problems by tuning queries and/or refining the data model to meet the needs of the business.
- Build data systems and pipelines.
- Evaluate business needs and objectives.
- Explore ways to enhance the product and pipeline.
- Collaborate with the team.
- Showcase skills and innovative ideas to the team biweekly.
Qualifications
- Strong knowledge of Python and SQL.
- Hands-on experience with SQL database design.
- Hands-on experience with, or knowledge of, Apache Airflow.
- Knowledge of Docker and Kubernetes.
- Experience running containerized microservices.
- Experience with Apache Spark or AWS EMR.
- Experience with cloud platforms (AWS, Azure) with strong preference towards AWS.
- Experience with database design practices.
- Technical expertise with data warehouses or data lakes.
- Expertise in configuring and maintaining PostgreSQL.
- Experience performance tuning queries and data models to produce the best execution plan.
- Strong experience building data pipelines & ETL.
- Experience working on an Agile Development team and delivering features incrementally.
- Experience with Git repositories.
- Working knowledge of setting up builds and deployments.
- Experience with both Windows and Linux.
- Experience demonstrating work to peers and stakeholders for acceptance.
- Ability to multi-task, be adaptable, and nimble within a team environment.
- Strong communication, interpersonal, analytical and problem-solving skills.
- Ability to communicate effectively with nontechnical stakeholders to define requirements.
- Ability to quickly understand new client data environments and document the business logic that composes them.
- Ability to integrate oneself into geographically dispersed teams and clients.
- A passion for high-quality software.
- Previous experience as a data engineer or in a similar role.
- Eagerness to learn and seek new frameworks, technologies, and languages.
- Commitment to working with others and sharing knowledge on a regular basis.
- Experience working with Azure DevOps, JIRA or similar project tracking software.
- Experience working in a startup environment.
- Experience with data streaming such as Apache Kafka, AWS Kinesis, Spark Streaming, or similar tools.
- Experience with other big data technologies at scale.
- Experience with BigQuery or similar (Redshift, Snowflake, other MPP databases).
- Knowledge of when to use NoSQL versus a traditional RDBMS.
- Experience with AWS RDS is a big plus.
- Kafka, RabbitMQ, or similar queueing technologies are a plus.
- Experience with BI tools such as Tableau and Jaspersoft.