Professional Experience
Results-driven Senior Data Engineer with 14+ years of expertise in data engineering, big data analytics, and automation. Specialized in ETL pipelines, cloud solutions, and BI dashboards, enhancing data-driven decision-making and operational efficiency.
Enterprise Data Solutions & Automation
- Designed scalable data pipelines for structured and unstructured data, integrating Java, Scala, Python, and big data tools.
- Automated ETL workflows, reducing manual effort and improving data accuracy and efficiency.
- Built cloud-based data architectures, ensuring high availability and performance.
Financial Services & Risk Analytics
- Developed fraud detection and risk models, optimizing financial data processing and compliance.
- Delivered real-time analytics dashboards to support investment strategies and regulatory reporting.
- Integrated multi-source data feeds (Bloomberg, Reuters, Oracle, Murex) for market insights and risk assessment.
Consulting & Technical Advisory
- Advised clients on cost-effective data solutions, aligning business objectives with technology strategies.
- Led cross-functional teams, ensuring seamless data integration and process automation.
- Created custom APIs and cloud-based tools, accelerating data accessibility and business intelligence.
Senior Data Engineer – Cloud Architect
Mercedes-Benz | Madrid, Spain
- Integrated SAP data into the Azure ecosystem, building a private-network ETL migration solution with ADF, Databricks, and Synapse.
- Worked on risk data for the AI department, building several financial reports in Power BI pulling SAP data end to end, developed on cloud technologies such as Snowflake, AWS, Databricks, and ADLS, resulting in a clean architecture for loading finance data into top-management reports measuring business results such as gains and losses.
- Spearheaded initiatives within the Analytics After Sales department, adhering to agile principles and employing Kanban boards for streamlined project management.
Previous Experience
BASF - Madrid, Spain
Senior Data Engineer - September 2019 to October 2020
Led cross-functional teams to analyze requirements and implement cloud solutions, integrated corporate data governance, and trained and mentored junior developers.
- Enhanced corporate data governance strategy using Collibra Data Governance Center, improving compliance and metadata management by 70%.
- Incorporated a microservices platform for metadata extraction, leveraging Python, Tornado, and Azure Functions in a serverless infrastructure.
- Designed end-to-end ETL data pipelines using Azure Data Factory and Databricks, processing large volumes of structured and unstructured data for data lakes and warehouses.
- Created reusable mappings, workflows, and transformations, enhancing pipeline efficiency and project consistency by 90%.
- Optimized real-time and batch data processing streams, reducing latency by 40%.
- Built and containerized services on the Collibra platform using Docker, improving deployment flexibility and scalability, reducing POC delivery time from months to days.
- Engineered CI/CD pipelines for enterprise data lake automation, ensuring seamless data ingestion and environment synchronization, improving data accessibility and governance.
IQVIA - Madrid, Spain
Data Engineer - August 2017 to September 2019
Contributed to the CODE Project for IQVIA (Quintiles & IMS Health), building a pan-European oncological data network across 7+ countries. The network improved decision-making for cancer treatment, reduced medicine costs for hospitals, and delivered valuable insights to funders.
- Responsible for technical installations, server sizing, and technical advisory to facilitate the integration of hospital data into the oncological network.
- Prepared and developed data extraction workflows from multiple systems (SAP, ad hoc vendors), adopting the best solution to minimize the risk and exposure of critical information.
- Set up anonymization rules for identifiable data using hashing techniques to comply with the GDPR (General Data Protection Regulation).
- Applied data quality processes such as data control, cleansing, modelling, completion, and recovery across distinct phases, standardizing the process for each vendor.
- Worked remotely with the central team on daily progress reviews, taking an agile approach and using Kanban for task assignment to complete several milestones.
- Engineered real-time data ingestion pipelines with Spark and Scala, enabling near-real-time processing for timely medical insights and reducing update lags to seconds.
- Built data visualization dashboards in Grafana, delivering actionable insights on treatment effectiveness and platform performance.
- Enhanced monitoring capabilities, reducing web service and dashboard downtimes by 90%.
Telefonica - Madrid, Spain
Data Engineer - July 2016 to August 2017
Worked on an IFRS compliance project, adapting local and global financial systems to new revenue recognition standards and ensuring regulatory compliance. Enhanced batch processing pipelines using Apache Spark, HDFS, and Parquet, optimizing data ingestion and storage in the enterprise data lake.
- Automated ETL workflows to extract, transform, and load 120 GB of daily data per provider, improving processing efficiency by 40%.
- Applied custom data transformation rules using Hive UDFs, ensuring data integrity and business rule compliance.
- Engineered data integration workflows to ingest and validate SAP HANA transactions, streamlining financial reporting.
- Created real-time monitoring dashboards in Grafana, enhancing financial and operational visibility.
- Incorporated data validation and auditing processes, ensuring compliance with IFRS and internal financial standards.
- Built Oozie-based orchestration pipelines, automating Spark job execution and managing dependencies. Improved job scheduling and monitoring, reducing manual intervention by 80%.
- Configured AWS S3 and HDFS workflows, enabling seamless cloud-based data ingestion and processing.
- Collaborated with cross-functional teams and stakeholders, providing technical solutions to business challenges.
Atos - Madrid, Spain
Financial Services Consultant - June 2015 to July 2016
Started the Cetelem Bank Leasing Project, designing and implementing legal, financial, and accounting reports for early-stage leasing operations.
- Crafted 100+ ETL workflows to automate data ingestion, transforming operational data into regulatory-compliant reports.
- Developed financial and legal reports with TIBCO Jaspersoft, streamlining reporting processes across Legal, Accounting, and Marketing. Increased data availability by 80%.
- Engineered data structures to support financial transactions, SEPA payment automation, and ledger integrations.
- Built Oracle PL/SQL queries to extract and consolidate financial data from several systems, increasing data quality by 90%.
- Provided technical consulting and system integration for Miles Leasing Software, enhancing fleet sales management by 25%.
- Advised senior stakeholders on data-driven insights, improving operational efficiency and financial decision-making.
- Developed proof-of-concept (POC) solutions on AWS Redshift, demonstrating the benefits of cloud-based parallel query execution over traditional database systems.
BBVA - Madrid, Spain
Data Engineer - June 2014 to June 2015
As a member of the BBVA bank fraud prevention group, processed and analyzed large volumes of data and credit card transactions, handling sensitive information with broad infrastructure access across a variety of architectures and multiple data sources. In addition to fraud analysis and quantification, provided technical support on fraud detection and delivered real-time alerts.
- Structured fraud detection processes to analyze large volumes of credit card transactions, identifying suspicious activity in real-time.
- Built data pipelines using PL/SQL, Bash, and Java, automating batch processing and optimizing ETL workflows for fraud prevention, reducing fraudulent activity by 70%.
- Designed and maintained cybersecurity dashboards in Highcharts JavaScript, providing real-time KPIs on fraud metrics for stakeholders.
- Executed risk monitoring algorithms using Talend and Intellinx, detecting unauthorized access attempts and brute-force attacks and reducing card fraud from CVV duplication by 20%.
- Engineered Control-M job scheduling scripts to automate periodic fraud analysis and reporting.
- Unified RSA and REST APIs with fraud detection systems, enhancing authentication security by 90%.
- Conducted data modeling and validation to ensure compliance with internal and external cybersecurity regulations.
- Ensured optimal performance and accuracy of fraud detection systems by providing technical support to cybersecurity teams.
Social Security - Ministry of Spain - Madrid, Spain
Analyst Developer - March 2012 to April 2014
Within the IT area for Social Security's data repository management, performed analysis, development, and performance improvement work on the Social Security ASG-Rochade application catalogue.
- Participated in data migration projects from Oracle to ASG-Rochade, ensuring seamless integration of government metadata systems.
- Programmed Java-based ETL pipelines to automate data extraction, transformation, and loading, improving metadata consistency.
- Enhanced SOAP-based web services, improving the efficiency of secure data exchange between government applications by 90%.
- Optimized performance of metadata repositories, reducing query response times by 30% through indexing and caching strategies.
- Automated impact assessment processes for application lifecycle management, improving data governance and compliance.
- Built customized reports to track the integration of metadata management tools (Rochade, Metability, Web Access) with legacy systems.
- Applied Agile methodologies to streamline software development cycles, ensuring timely project delivery.
Metryc - Madrid, Spain
Principal Engineer - May 2010 to March 2013
Freelance engagement at Metryc, a privately funded start-up providing digital biometric payment solutions such as facial recognition and fingerprint sensors.
- Introduced Antiphishing VeriLook, a biometric payment platform utilizing facial recognition and fingerprint authentication for secure transactions.
- Created backend services using Java, Spring, and iBatis, applying MVC architecture to ensure modular and scalable development.
- Engineered end-to-end encryption protocols using Blowfish cipher, securing communication between client-server transactions and increasing security by 99%.
- Created certified PDF reports to validate and audit biometric transactions, ensuring regulatory compliance for financial institutions.
- Included facial recognition SDK integration, leveraging Neurotechnology VeriLook for identity verification; migrated front-end technology from Flash (ActionScript) to HTML5, improving cross-platform compatibility and security.
- Provided technical consulting on biometric authentication systems, advising banking institutions on secure payment infrastructure.
ISBAN - Santander Bank - Madrid, Spain
Programmer - March 2012 to April 2014
Data migration and functionality integration for the Asset Control upgrade from version 5.0 to 6.2.
- Optimized ETL pipelines for market data ingestion from multiple sources (Reuters, Bloomberg, Triach, Murex, Condor), handling thousands of daily unstructured files.
- Automated data loading workflows using Unix scripting, reducing manual intervention by 80% and improving processing efficiency.
- Served backend services in Java Spring and iBatis, enabling scalable and reliable data transformations.
- Formulated a Java Swing-based real-time dashboard for monitoring data loads, lifecycle tracking, and anomaly detection, improving detection by 70%.
- Applied business control mechanisms to detect data outliers for risk calculations, improving data accuracy by 40% for fixed income and equity markets.
- Enhanced Formula Engine capabilities for derivatives pricing, standard deviation calculations, and volatility analysis in risk modeling.
- Standardized and synchronized financial data across diverse formats (Excel, CSV, XML, TXT), ensuring compliance with regulatory and risk assessment frameworks.
