Senior Data Engineer, Stockholm, Sweden

Might be available

(Updated 2022-10-13)


Native English, Hindi, Telugu; beginner Swedish

  • Cloud Migration
  • Advanced Data Analytics
  • Business Intelligence & Data Warehouse

Skills (27)

Cubes

Business Intelligence

Python

Power BI

SQL

IaC

Spark

Visualization

Data Sources

Data Pipelines

DAX/MDX

Dell Boomi

PowerShell

Informatica

Alteryx

Bash

CI/CD

Big Data

Retail

Analytics Platform

Telecom

GCP

Airflow

Kafka

Scala

Docker

Time Management

Summary

Vijetha is a dynamic professional with extensive experience in designing and developing large-scale batch and real-time data pipelines optimized for scale.
She has worked across several industries (manufacturing, retail, life sciences, and travel & transportation) and geographies (Europe, Asia and the USA).

As a Senior Data Engineer, she has handled the end-to-end requirement specification, design and development of big data pipelines, drawing on wide experience with cloud platforms such as AWS, Azure and GCP. She also has experience with GitHub and Azure DevOps, and with building CI/CD pipelines. She has worked with various product teams, enabling them to make the most of their data.

Vijetha has also worked as a technical architect and has vast experience in both ETL and front-end tools, contributing to the design, architecture and development of many projects involving huge data sets and advanced analytics.

Professional Experience

Aigometri AB

2019-08 - Present

Data Engineer
H&M Omni

2022-01 - Present

Retail Services

The purpose of the project is to provide a single point of access for all the source systems along with newly onboarded data streams.

Tasks/ Achievements
• Integrate the data from various source streams into a single delta lake
• Apply Databricks transformations for the Bronze, Silver and Gold layers using Delta Live Tables and Autoloader
• Implement the logic for batch and stream processing
• Implement the principle of least privilege on AAD and secure the infrastructure
• Migrate the Azure DevOps code repo to a GitHub repository
• Create GitHub Actions CI/CD pipelines using YAML
• Performance-tune the Spark transformations
• Assist and guide the users in creating Power BI datamarts
• Cloud migration activities (data movement) from Azure to GCP
• POC to move the Databricks transformations to BigQuery
• Spike on Looker and BigQuery to weigh the success criteria
Data Engineer
H&M Online

2021-03 - 2021-12

Retail Services

The aim of the project is to migrate the data warehouse and new data streams from on-prem to Azure Cloud.

Tasks/ Achievements
• Create the infrastructure using ARM and Bicep, and later in Terraform
• Build CI pipelines (SonarQube, Coverity) to validate the code being sent to version control using Azure DevOps
• Create the deployment (CD) pipelines for continuous deployment
• Develop and maintain data pipelines to onboard the data
• Implement network and infrastructure security
• Big data processing on Synapse using Spark transformations
• Create Azure Analysis Services tabular models and connect them to Power BI
Data Engineer
Telia AB

2020-06 - 2021-02

Telecom Services

The purpose of the project is to create a POC on the AWS platform to onboard the on-prem data streams to the cloud, and from there to a NoSQL database and machine learning pipelines.

Tasks/ Achievements
• Create secure, scalable cloud network configuration for VPC, subnets, IGW, TGW, NAT, route tables, etc.
• Enable CloudWatch and CloudTrail to monitor the environment
• Build a first-cut solution to integrate on-prem Kafka with AWS MSK and then S3, and apply storage lifecycle management policies for the data
• Simulate a low-latency data flow to write an MSK topic to DynamoDB using Lambda function code
• Infrastructure as code via CloudFormation and Terraform
• Test the user data for EC2 instances in Docker containers
• Create Spark streaming jobs on Glue for the data movement and run them from a private EC2 instance
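The MSK-to-DynamoDB step above can be sketched as a Lambda handler. This is a minimal sketch, assuming the documented MSK trigger event shape; the `device_id` attribute and table layout are illustrative, not taken from the project.

```python
import base64
import json


def decode_msk_records(event):
    """Decode base64-encoded Kafka records from an MSK trigger event
    into DynamoDB-ready items (attribute names are illustrative)."""
    items = []
    for records in event.get("records", {}).values():  # keyed by topic-partition
        for record in records:
            payload = json.loads(base64.b64decode(record["value"]))
            items.append({
                "device_id": payload["device_id"],  # assumed partition key
                "ts": record["timestamp"],
                "payload": payload,
            })
    return items


def handler(event, context):
    # In the real function a boto3 DynamoDB table would be written to, e.g.:
    #   table = boto3.resource("dynamodb").Table("device-events")
    #   with table.batch_writer() as batch: ...
    return decode_msk_records(event)
```

Decoding is kept separate from the (commented-out) DynamoDB write so the transformation can be unit-tested without AWS access.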
Data Analyst
Marakanda AB

2019-12 - 2020-04

Telecom Services

Using a lightweight SDK deployed inside any host app, Marakanda collects device insights. Device insights capture approximately 50 unique data attributes per user/collection. Smartphone behavioural data aids in generating unique upgrade and churn predictions, thus enhancing marketing ecosystems and enabling the mobile operators to target the right customers at the right time for the right reasons, eventually leading to happy customers.
Infinite Computer Solutions

2019-03 - 2019-07

Dover Corp, India
• Experienced professional in Agile and Waterfall methodologies
Technical Architect
Kaseya

2019-03 - 2019-07

IT Management Services

The purpose of the project is to create a standardised data warehouse for Kaseya and related entities, resulting in a single version of truth for key performance indicators and cost reduction through lower system maintenance.

Tasks/ Achievements
• Design the end-to-end data pipeline to integrate different source CRM systems into a single data lake
• Design the data warehouse in AWS Redshift for CSX and NC customers
• Implement an Amazon Aurora database for Unitrends customers
• Automate the data pipeline to move JSON data in S3 to Redshift to ensure a maintenance-free system
• Migrate the existing transformations in Alteryx to Dell Boomi
• Implement Master Data Management (MDM) in Dell Boomi to arrive at customer master data in the data lake
• Create and manage workspaces for the users and auto-refresh Power BI datasets using REST APIs
• Embed Power BI reports in the Salesforce portal for easy access
• Manage row/object-level security for user groups
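The automated S3-to-Redshift load described above typically centres on a generated COPY statement. A minimal sketch, assuming a `FORMAT AS JSON 'auto'` load; the table name, bucket and IAM role ARN are placeholders, not from the source:

```python
def build_copy_statement(table: str, s3_path: str, iam_role: str) -> str:
    """Build a Redshift COPY statement for JSON files on S3.
    FORMAT AS JSON 'auto' maps JSON keys to column names automatically."""
    return (
        f"COPY {table} "
        f"FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT AS JSON 'auto';"
    )


# Example with placeholder values (not from the project):
sql = build_copy_statement(
    "analytics.customers",
    "s3://example-bucket/customers/",
    "arn:aws:iam::123456789012:role/redshift-copy",
)
```

In a scheduled job, the generated statement would be executed against the cluster (e.g. via a database driver or the Redshift Data API), which is what makes the load hands-off.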
IBM Pvt Ltd

2017-08 - 2019-01

IBM Pvt Ltd, India
BI Architect
Markem-Imaje

2017-08 - 2019-01

Manufacturing Services

The objective of the project is to provide a single platform for the Sales and Supply Chain Management operations of MI, so that the reporting solutions let users drill across from either segment.

Tasks/ Achievements
• Analyze the source system operations (Salesforce & SAP) to implement in the BI environment
• Integrate data across SAP and Salesforce into a single BI stream
• Create proofs of concept on Microsoft Azure Data Factory to integrate data from cloud and on-premise data sources
• Develop advanced PL/SQL packages, procedures, triggers and collections to implement business logic using SQL Navigator to move/manipulate data from the data warehouse to the datamart
• Create Power BI dashboards using datasets/dataflows to distribute to higher management for critical decision making
• Customize DAX functions and M query in the desktop and Power Query respectively
• Create proofs of concept on Power BI dual mode and shared datasets
Hewitt Associates

2011-07 - 2017-08

Hewitt Associates, India
• Collect app data from iOS/Android devices through Kinesis Firehose
Product Specialist
H&M

2014-04 - 2017-08

Retail Services

The purpose of the project is to stabilize and merge the analytics platform for Online and Store data, and to administer the analytics application and the respective backend databases in SQL Server.

Tasks/ Achievements
• Identified the technical needs of the organization that caused delays/risks and closed the gaps by proposing/implementing innovative solutions (LCM, Informatica Metadata Manager) and foolproof testing and analysis tools
• Merge the existing analytics platforms (BI Online and Store) in H&M into a single platform to ease user application access
• Automate Cognos report testing during environment switches using Lifecycle Manager, and write the test cases accordingly
• Explore the services included in Microsoft Power Platform to enable automation and enhance process efficiency
• Create a custom tool to extract server information from SSAS cubes, which aids in switching environments efficiently
• Performed product evaluations and proofs of concept to analyze the workload/risks in migrating Microsoft BI cubes to Dynamic Cubes, giving a single backend environment for Online and Sales customers
• Application integration between frontend and backend with Informatica Metadata Manager, minimizing the time needed for impact analysis and data lineage on the system
BI Analyst
CMA-CGM

2011-07 - 2014-04

Travel and Transportation

The mission of the project is to provide a proactive and innovative carrier tracking service to CMA-CGM customers by creating high-quality business monitoring dashboards (Diva), and by implementing an accurate backend database (Opera) for the BI system.

Tasks/ Achievements
• Collaborate on and design the carrier tracking database 'Opera' (logical and physical) in SQL Server using CA ERwin Data Modeler
• Develop mappings in Alteryx to extract, load and transform data from Excel and Oracle into SQL Server to provide a single source for the reports
• Create ETL jobs using different stages such as transformer, sequence generator, lookup, aggregator, join, merge, filter and funnel to cater to the reporting needs
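The stage chain above (lookup, join, filter, aggregator) can be illustrated with plain Python over in-memory rows. This is only a sketch of the pattern; the column names (`port_code`, `status`, `teu`) are invented for the example:

```python
from collections import defaultdict


def run_etl(shipments, ports):
    """Mimic a lookup -> join -> filter -> aggregator stage chain on row dicts."""
    # Lookup stage: build a reference map from the dimension rows
    port_lookup = {p["code"]: p["name"] for p in ports}
    # Join/merge stage: enrich each fact row with the looked-up port name
    enriched = [dict(s, port=port_lookup.get(s["port_code"], "UNKNOWN"))
                for s in shipments]
    # Filter stage: keep only active shipments
    active = [s for s in enriched if s["status"] == "active"]
    # Aggregator stage: total container volume (TEU) per port
    totals = defaultdict(int)
    for s in active:
        totals[s["port"]] += s["teu"]
    return dict(totals)
```

In a graphical ETL tool each of these steps is a separate configurable stage; the code just makes the data flow between them explicit.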
Accenture Pvt Ltd

2010-02 - 2011-07

Accenture Pvt Ltd, India
• Refine and pull the data lake (S3) events using Python code
BI Programmer
Hewitt Associates

2006-01 - 2011-01

India, Payroll Services; Wyeth, North America (Life Sciences); Merck CSI, US (Healthcare Services)
NoSQL

2006-06 - 2010-02

• Collect the device/user-specific encrypted keys (GDPR compliant) in DynamoDB (NoSQL) up to data warehouse level
• Create external tables using Redshift Spectrum to be able to reference Athena tables in Redshift
• Create Parquet/ORC files on S3 so external systems can use BI-processed data cost-effectively
• Create Glue ETL jobs to identify devices/users uniquely using different attributes and data points
• Generate QuickSight dashboards and datasets via Athena
• Integrate third-party app data into the existing BI system via API calls to enrich the existing data
• Write Python code for the different tasks and dependencies of each job, with workflow management and automation using Apache Airflow
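Collecting device keys in a GDPR-compliant way usually means storing a stable, non-reversible derivative of the identifier rather than the raw value. A minimal sketch of that idea using HMAC-SHA256; the secret key and field naming are illustrative assumptions, not the project's actual scheme:

```python
import hashlib
import hmac


def pseudonymize(device_id: str, secret_key: bytes) -> str:
    """Derive a stable, non-reversible key for a device identifier.

    HMAC-SHA256 gives the same output for the same input (so records can
    still be joined on the derived key) while the keyed hash cannot be
    reversed to the raw identifier without the secret.
    """
    return hmac.new(secret_key, device_id.encode(), hashlib.sha256).hexdigest()
```

The derived key, not the raw device ID, is what would land in DynamoDB and flow onward to the warehouse; rotating or destroying the secret severs the link back to individuals.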

Academic Background

B. Tech
JNTUK University

2024-09 - 2024-09

Certifications

Microsoft Azure Administration Associate
IBM Analytics Multidimensional Author
Professional Dell Boomi Developer
AWS Certified Cloud Practitioner
Associate Dell Boomi Developer
Microsoft Azure Fundamentals
CMA
