Could be available
(Updated 2022-07-19) Senior Front-end and Data Engineer
København, Denmark
Native Spanish, fluent English, beginner Danish
- Data engineer
- Fullstack cloud engineer
- Frontend (JavaScript)
Skills (34)
Excel
BI
ETL
HTML/CSS
AWS EC2
Python
Business Intelligence
Amazon Elastic Compute Cloud
Competitive Intelligence
API
Amazon Web Services
SQL
Market Intelligence
Test Case Development
Google Cloud Platform
Azure Cloud Services
Microsoft Azure
Metrics
Jira
AWS S3
AWS Lambda
Google Sheets
Serverless Architecture
Node.js
React
Docker
Google Cloud
Confluence
Vue
Kubernetes
Salesforce
SAS
MongoDB
Problem Solver
Summary
Javier is a certified AWS cloud consultant and a technical expert in data engineering, Big Data, ETL, BI, web development and NLP. He is a polyglot programmer, but his main expertise lies in Python and JavaScript/TypeScript and the platforms supporting these languages, such as Node.js and the Python library ecosystem. He is passionate about improving and applying his skills in development and data engineering, both professionally and in private projects. Javier finds great motivation in projects with a positive sustainability impact on society.
Javier has broad experience in developing full-stack applications centered around data science and data engineering, visualization of business data, and cloud services. He has worked with Artificial Intelligence, NLP and Machine Learning, and as a skilled data engineer he has used various cloud-native services to produce working data-driven solutions for real-world problems.
With his knowledge and experience, Javier can act both as a technical consultant/architect and as a developer implementing cloud solutions with AWS and Azure. He also has prior development and consultancy experience with Google Cloud Platform services and IBM.
Javier is a great self-starter who takes pride in clarifying customers' requirements to ensure successful implementations. He prefers to work in an agile, iterative manner, for example by creating POCs so the customer can validate the solution in increments and adjust as needed.
Work Experience
2021-11 - Present
2022-03 - 2022-05
Software Developer
Responsible for enabling Jira's Ultimate Customizer with the correct UI/UX in signups, approvals, single portals, the portals page, user login, the profile page, create request, view request, and the requests page, and for fixing a permissions error when editing the login screen for minor version 4.3.0.
2022-01 - 2022-03
Integration between Monday.com and Kortinfo
Responsible for developing an automatic real-time synchronization solution for a client, from monday.com to KortInfo (a web map/GIS provider).
Project description:
The client used two systems holding the same information for different purposes: viewing projects on a map and managing those projects in a project management tool. They lacked an automatic way to push new projects and updates to the web map environment. The goal of the project was to implement a serverless solution that listens for critical changes in monday.com.
Tasks:
Solution design and requirements gathering with the client and a third-party developer
Building a JavaScript interface that transforms REST/GraphQL API data into the real-time POST requests expected by the web map GIS provider's API
Monitoring these changes through an RSS feed connected to a business intelligence report to catch potential issues after going live
Technologies used:
JavaScript, REST, GraphQL, Zapier.
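The core of such an integration is the mapping from a monday.com item to the payload the GIS provider expects. A minimal sketch of that transform is shown below in Python for illustration (the original was implemented in JavaScript); the column ids (`lat`, `lon`, `status`) and the GeoJSON-like output shape are assumptions, since real monday.com boards define their own column ids.

```python
def monday_item_to_gis_feature(item: dict) -> dict:
    """Map a monday.com item (as returned by its GraphQL API) to a
    GeoJSON-like feature suitable for POSTing to a web map API.
    Column ids here are illustrative -- real boards define their own."""
    columns = {c["id"]: c.get("text") for c in item["column_values"]}
    return {
        "type": "Feature",
        "geometry": {
            # GeoJSON order is [longitude, latitude]
            "coordinates": [float(columns["lon"]), float(columns["lat"])],
            "type": "Point",
        },
        "properties": {
            "name": item["name"],
            "status": columns.get("status"),
            "monday_id": item["id"],
        },
    }
```

In the real pipeline this function would run inside the serverless handler triggered by a monday.com webhook, with the result sent as the body of the POST request to the GIS provider.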
2017-01 - 2021-10
2021: BI/Data Engineer projects
Responsible for developing productivity reports based on Azure Log Analytics through Power BI in three T-Systems internal projects, and for a T-Systems client (Volkswagen Spain).
Project description:
The project aimed to filter, aggregate, and plot critical logs, providing insights for cost savings and better solution offerings. The monitoring solution was then sold to existing T-Systems clients.
The data engineering side project with Volkswagen Spain consisted of an Informatica PowerCenter ETL aimed at migrating their client management data warehouse.
Tasks:
Consulting on data structure requirements
Performing ETL with Azure Data Factory or with Informatica PowerCenter
Extracting the processed information from a data lake to Power BI
Creating data model on Power BI
Creating a Power BI dashboard to monitor desired KPIs
Technologies used:
Power BI, ETL, Azure Log Analytics, Power Query, Informatica PowerCenter.
2020: Big data technician
Responsible for setting up the cloud environment and migrating data for data scientists at a large insurance company. In this project, Javier used Python to implement the business logic.
Tasks:
Migrating the entire on-premises house insurance dataset to the cloud
Handling its encoding issues
Performing anonymization/data masking at the edge
Normalizing the data for a cloud-automated system ingesting new data
Technologies used:
Python, AWS Lambda, AWS Step Functions, AWS S3, AWS Athena, AWS IAM and remote connections.
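The masking step of such a pipeline can be sketched as deterministic, salted hashing: equal inputs map to equal tokens, so datasets can still be joined after anonymization. This is a minimal stdlib sketch under assumed field names; the real implementation ran in AWS Lambda handlers, and a production salt would come from a secrets store, not a constant.

```python
import hashlib

SALT = b"illustrative-salt-use-a-secrets-manager"  # assumption, not the real setup

def mask(value: str) -> str:
    """Deterministically pseudonymize an identifier: the same input
    always yields the same token, so joins keep working after masking."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:16]

def anonymize_record(record: dict, pii_fields=("name", "address", "policy_id")) -> dict:
    """Mask PII fields in a record before it leaves the ingestion edge;
    non-PII fields pass through unchanged."""
    return {
        key: mask(value) if key in pii_fields and value else value
        for key, value in record.items()
    }
```

Wired into S3-triggered Lambdas and orchestrated with Step Functions, this kind of function masks records as they are ingested rather than after they land in the lake.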
2019: Data engineer and Power BI technician
Development and architecture for an ETL and BI project.
Project description:
The client is the biggest gas distributor in Spain, and the overall aim of the project was to monitor the financial reports sent to headquarters. The task was to migrate their data from heavy, macro-laden Excel workbooks to a centralized, automated, and data-optimized cloud system on AWS, along with Power BI reporting. This was achieved using Python with appropriate data science libraries such as pyspark, pandas and numpy for the ETL logic.
Tasks:
Extract data files from different cloud sources (SAP, Salesforce, Excel documents) into Amazon S3 buckets
Process the data and compute derived values with AWS Glue (boto3 and pyspark libraries)
Dump the resulting data into Amazon Redshift tables, which Power BI then used for its data representation
Deliver a precise Power BI dashboard for the company's headquarters
Technologies used:
Amazon S3, AWS Glue, Python (pyspark, pandas, numpy), Amazon Redshift, SQL, Power BI.
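The transform step of a pipeline like this is essentially a group-by aggregation before the load into Redshift. The Glue jobs themselves were written in pyspark; the equivalent logic is illustrated here with a stdlib-only sketch, and the column names (`region`, `month`, `revenue`) are invented for the example.

```python
import csv
import io
from collections import defaultdict

def aggregate_monthly_revenue(csv_text: str):
    """Group raw transaction rows by (region, month) and sum revenue --
    the same shape of aggregation a pyspark Glue job would perform
    before loading the result into a Redshift table."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[(row["region"], row["month"])] += float(row["revenue"])
    return [
        {"region": region, "month": month, "revenue": round(total, 2)}
        for (region, month), total in sorted(totals.items())
    ]
```

In pyspark the same step is a `groupBy(...).agg(sum(...))` over a DataFrame read from S3, with the output written to Redshift via the Glue connection.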
2018 - 2019: Data engineer and Power BI technician
Development and architecture in an ETL project
Project description:
The client (one of the most prominent private online universities in Spain) wanted to present KPIs for organizing personal lesson schedules. The project was to analyze which courses had higher or lower demand, and the distribution of students by gender, occupation, and interests. Javier was responsible for the data engineering process and the Power BI tasks to plot the results.
Tasks:
Extract data from both an Oracle database and Google Sheets spreadsheets
Transform the data with Data Lake Analytics and Azure App Service within Azure Data Factory
Load the data into CSVs in Data Lake Storage
Load these CSVs into Power BI for its data representation
Create Power BI metrics related to certain KPIs
Represent the data based on these KPIs
Technologies used:
Google Cloud Platform (Google Spreadsheets API), Data Lake Analytics, Azure App Service, Azure Data Factory, Azure Data Lake Analytics, C#, Azure Data Lake Storage, Power BI, DAX.
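The demand and distribution KPIs behind such dashboards reduce to simple frequency counts over the enrollment records. A stdlib sketch of that reduction is below; the field names (`course`, `gender`) are assumptions, and in the actual project this shaping happened in the Azure Data Factory pipeline with the final measures expressed in DAX.

```python
from collections import Counter

def enrollment_kpis(enrollments):
    """Count enrollments per course and the gender distribution of
    students -- the raw numbers behind demand/distribution KPI visuals."""
    return {
        "by_course": Counter(e["course"] for e in enrollments),
        "by_gender": Counter(e["gender"] for e in enrollments),
    }
```

A BI tool then ranks `by_course` to surface high- and low-demand courses and renders `by_gender` as a distribution chart.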
2017-2018: Software developer
Full chatbot development for an insurance company.
Project description:
A big insurance company in Spain wanted to implement a chatbot on their webpage to reduce the phone workload for easily answered questions. The task was to build it and embed it at the bottom of their webpage (it loads once the page fully renders). The development was done using JavaScript, Node.js, Python, SQL and R.
Tasks:
Develop a fully functional and customizable chatbot (now live on vivaz.com; it loads when the page fully renders) from a basic Microsoft template
Mentor an intern in development and coordinate tasks with him
Technologies used:
Amazon Web Services (AWS EC2, AWS CodeCommit, Amazon Redshift, AWS Glue and Amazon S3, among others)
Microsoft Azure (LUIS, QnA Maker, App Service, Data Lake Storage, Azure Data Factory, Data Lake Analytics)
IBM Watson, Oracle Intelligent Bots, Google Cloud and SAS
Programming in JavaScript, Node.js, Python, U-SQL, R
2017: Data mining technician
Data mining technician and machine learning.
Project description:
At the border crossing between the United States and Mexico, toll stations needed to detect specific vehicles in the search for drug traffickers. The project aimed to automatically detect vehicles with a laser scanner so the system could filter out vehicles potentially involved in illegal activities. The logic was implemented in Python.
Tasks:
Create a Python machine-learning system able to process laser vehicle profiles
Train the model on thousands of vehicles
Enhance the profiling detection of the vehicles
Technologies used:
Development of artificial vision software in Python to classify vehicles
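The actual classifier is not described in detail, but the underlying idea of matching a 1-D laser height profile against known vehicle classes can be sketched as a nearest-centroid comparison. The class centroids and profile values below are invented purely for illustration.

```python
def classify_profile(profile, centroids):
    """Return the vehicle class whose centroid (average height profile)
    is closest to the measured laser profile, by sum of squared
    differences. Profiles are fixed-length sequences of heights."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: distance(profile, centroids[label]))

# Illustrative average height profiles in metres -- not real training data.
CENTROIDS = {
    "car":   [1.4, 1.5, 1.5, 1.4],
    "truck": [3.8, 4.0, 4.0, 3.9],
}
```

In a trained system the centroids (or a richer model) would be learned from the thousands of scanned vehicles mentioned above, rather than hand-written.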
2016-01 - 2017-01
Building a chatbot
Technologies used:
Java programming, data mining, and AI training using different APIs from IBM Watson Cloud, Amazon Lex, and Instagram
Pre-sales technician for IBM Watson Health solutions for Portugal, Spain, Greece and Israel, with special focus on Watson for Drug Discovery
Academic Background
2014-09 - 2015-06
2011-09 - 2014-06
Certificates
2022-06
2021-01
2017-01
2022-03
2016-01