Fernando Ferrer

Data Analytics Expert

With expert knowledge of Postgres, Oracle, Redshift, & SQL.

Fernando Ferrer is a data consultant specializing in big data analytics and statistical software implementation. He has expert knowledge of Postgres, Python, Redshift, Oracle, Hadoop, SQL, and MongoDB. His clients include AddShoppers, SkillerWhale, AppviewX, Sante Circle Health, SERMO, and the Government of Estonia, among others.

Data Analytics
Big Data Analytics
Data Engineering
Data Warehousing
Data Science
Postgres
Redshift
Hadoop
SQL
MongoDB
Python
Oracle
Data Extraction / ETL
MySQL
PHP
Java

Employment Highlights

Senior Data Engineer & SQL Developer

Independent

June 2017 - Present (6 years 11 months)

Technology Director

Immutable Data

August 2017 - Present (6 years 9 months)

Data Engineering Consultant

Hockeystick

May 2017 - July 2017 (3 months)

Data Engineer

Independent

June 2013 - April 2017 (3 years 11 months)

Education Highlights

Advanced Diploma, Computer Programmer Analyst

Saint Lawrence College

September 2006 - June 2008 (1 year 10 months)

BSc, Software Engineering

URBE University

September 2002 - June 2005 (2 years 10 months)

Portfolio

Resume

Senior Data Engineer & SQL Developer

Independent

June 2017 - Present (6 years 11 months)

Fernando Ferrer has worked on data projects for various clients:


As a senior data engineer for AddShoppers:

  • Designed an ETL pipeline to move over 100 million daily events from MongoDB to Postgres (a minimal sketch follows this list).
  • Designed data warehouse using BigQuery.
  • Refactored Python Django app to instrument events properly.
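
The first bullet above describes moving high-volume events from MongoDB into Postgres. As a hedged illustration only (the actual AddShoppers pipeline is not published), one batch-load hop of that kind might look like the sketch below; the connection strings, collection, and table names are hypothetical placeholders.

# Hypothetical sketch of one MongoDB -> Postgres batch ETL hop.
# Connection strings, collection, and table names are placeholders.
from datetime import datetime, timedelta, timezone
import json

import psycopg2
from psycopg2.extras import execute_values
from pymongo import MongoClient

def load_recent_events(mongo_uri: str, pg_dsn: str, hours: int = 1) -> int:
    """Copy events newer than `hours` ago from MongoDB into a Postgres table."""
    since = datetime.now(timezone.utc) - timedelta(hours=hours)
    events = MongoClient(mongo_uri)["analytics"]["events"]  # assumed collection

    rows = [
        (str(doc["_id"]), doc["ts"], json.dumps(doc.get("payload", {})))
        for doc in events.find({"ts": {"$gte": since}})
    ]

    with psycopg2.connect(pg_dsn) as conn, conn.cursor() as cur:
        # ON CONFLICT keeps reruns idempotent; assumes a unique index on event_id.
        execute_values(
            cur,
            """INSERT INTO raw_events (event_id, event_ts, payload)
               VALUES %s ON CONFLICT (event_id) DO NOTHING""",
            rows,
        )
    return len(rows)

At the scale described (over 100 million events per day), a production pipeline would page through the collection and bulk-load via COPY rather than insert rows directly; the sketch only shows the shape of the hop.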

As a training consultant for SkillerWhale:

  • Developed a custom Postgres curriculum for corporate training, as well as tests for different database courses.

As a MongoDB administrator for AppviewX:

  • Tuned the performance of a MongoDB replica set.
  • Created technical manuals for deploying and administering a MongoDB replica set.

Technology Director

Immutable Data

August 2017 - Present (6 years 9 months)

Immutable Data is an elastic data engineering and data science consulting firm, where Fernando has managed engagements for several clients.


As a chief data officer for Sante Circle Health:

  • Designed and architected secure data infrastructure on AWS.
  • Managed a team of three full-stack software engineers.
  • Designed data warehouse on Redshift.
  • Hired and mentored new talent.
  • Designed and developed policies and procedures to be followed by the organization.
  • Secured AWS infrastructure to ensure SOC 2 Type 2 and HIPAA compliance.

As a senior data engineering consultant for SERMO:

  • Built a data warehouse in Redshift to ingest over 1 billion records and reduce BI query time to 1 second or less.
  • Built a custom Python ETL pipeline to move around 2 million data points per day from a Postgres OLTP database to Redshift, feeding an ML algorithm that mines physician posts to understand their interests.
  • Administered the Sisense cluster and reporting platform to centralize BI reporting and reduce report load times to 5 seconds or less.
  • Developed custom queries and SQL procedures for ad-hoc reporting.
  • Redesigned the data warehouse model to accommodate the multi-dimensionality of the data.
  • Led a data team of eight engineers and two data scientists.
  • Maintained relationships with decision makers.
  • Deployed ETL workflows across the organization.
  • Trained engineers on Airflow deployment.

As a data engineering project lead for Kinduct Technologies:

  • Administered JIRA board.
  • Trained and developed in-house talent on project management and data engineering.
  • Performed hiring interviews.
  • Developed employee training and development map.
  • Architected an ETL pipeline using Airflow, MySQL, Postgres, Redshift, and Snowflake.
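
The last bullet above mentions an Airflow-orchestrated pipeline across MySQL, Postgres, Redshift, and Snowflake. As an illustrative sketch only (not the actual Kinduct DAGs), a nightly extract-then-load pipeline in Airflow 2.x could be wired roughly like this; the task callables and schedule are assumptions.

# Hypothetical Airflow 2.x DAG sketch: nightly extracts from OLTP sources,
# then a warehouse load. The callables below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_from_mysql(**context):
    """Pull yesterday's rows from MySQL (placeholder)."""
    ...

def extract_from_postgres(**context):
    """Pull yesterday's rows from Postgres (placeholder)."""
    ...

def load_to_redshift(**context):
    """Bulk-load the staged extracts into Redshift (placeholder)."""
    ...

with DAG(
    dag_id="nightly_warehouse_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_mysql = PythonOperator(task_id="extract_mysql", python_callable=extract_from_mysql)
    extract_pg = PythonOperator(task_id="extract_postgres", python_callable=extract_from_postgres)
    load = PythonOperator(task_id="load_redshift", python_callable=load_to_redshift)

    # Both extracts must finish before the warehouse load runs.
    [extract_mysql, extract_pg] >> load

The final line uses Airflow's dependency operator: the Redshift load waits on both upstream extracts.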

As a director of technology and training for NobleProg:

  • Developed training curricula for courses in data engineering and data science.
  • Maintained client relationships.
  • Delivered training for clients such as L’Oreal Canada, the Department of National Defense, Logitech East Asia, the Government of Mexico, Bell Canada, TD Bank, and the California Revenue Board.
  • Awarded the Best Rated Training of 2018.

As a data engineering consultant for Androdon Capital:

  • Developed a data pipeline using Python and Google Cloud as the backend to move over 357 million records.
  • Coded proprietary FX and stock trading models in C#.
  • Processed over 3 million data points a day.

As a data consultant for the Government of Estonia:

  • Developed a market report on the state of the data scientist job market in North America.
  • Introduced the idea of creating a technology-focused conference in North America; that idea matured into what is today Latitude 44.

Data Engineering Consultant

Hockeystick

May 2017 - July 2017 (3 months)

Hockeystick uses machine learning and data to help startups connect to funders.

  • Built data warehouse data model.
  • Developed infrastructure for DaaS offering using AWS stack.
  • Built a custom ETL tool in Python to aggregate and clean data.
  • Developed an operating budget for the data department.
  • Created a custom API using Python and Flask (sketched after this list).
  • Recruited engineering talent.
  • Improved query times by 50% by altering the data model and improving indexes.
  • Wrote Ruby on Rails controller to extract application data in real-time.
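
The custom API bullet above names Python and Flask; the minimal sketch below illustrates that kind of service. The route, metric names, and in-memory data are hypothetical stand-ins, not Hockeystick's actual API.

# Minimal Flask API sketch for serving aggregated metrics.
# Route names and the in-memory "store" are illustrative placeholders.
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Stand-in for a real warehouse query layer.
METRICS = {
    "funding_rounds": {"count": 1280, "updated": "2017-07-01"},
    "active_startups": {"count": 342, "updated": "2017-07-01"},
}

@app.route("/api/metrics/<name>", methods=["GET"])
def get_metric(name: str):
    """Return a single aggregated metric as JSON, or 404 if unknown."""
    metric = METRICS.get(name)
    if metric is None:
        abort(404, description=f"unknown metric: {name}")
    return jsonify({"metric": name, **metric})

if __name__ == "__main__":
    app.run(port=5000)

In practice the in-memory dictionary would be replaced by a query against the data warehouse described in the earlier bullets.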

Data Engineer

Independent

June 2013 - April 2017 (3 years 11 months)

  • Developed a name-matching algorithm to match web-indexed data against a physician database with above 90% accuracy, using Python and MySQL (a minimal sketch follows this list).
  • Worked with data science team to collect application usage data to improve retention and usage.
  • Developed an ETL pipeline to ingest ICD-9 and ICD-10 codes from multiple sources in order to determine key metrics for medical procedures performed in the USA.
  • Created BI system hosting over 160 billion records.
  • Developed career plan and compensation for the engineering department.
  • Reduced query times by 66% by improving indexes and query caching.
  • Reduced data ingestion times by 85% by moving on-premise infrastructure to AWS.
  • Reduced cost of infrastructure by provisioning AWS reserved instances.
  • Refactored front end app to use MongoDB.
  • Developed ETL tool to move data from Hadoop EMR to MongoDB.
  • Worked with data scientists to create a network-of-influence algorithm.
  • Led technology audit prior to selling the business.
  • Negotiated with counterparty during acquisition.
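
The name-matching bullet at the top of this list describes fuzzy matching of web-indexed names against a physician database. The sketch below shows the general technique using only the Python standard library (difflib); the field names, normalization rules, and threshold are assumptions, and the 90%+ accuracy figure belongs to the original production system, not to this toy example.

# Hypothetical fuzzy name-matching sketch using only the standard library.
# Thresholds, normalization rules, and record fields are assumptions.
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Lowercase, drop non-alphanumeric characters, and collapse whitespace."""
    cleaned = "".join(ch for ch in name.lower() if ch.isalnum() or ch.isspace())
    return " ".join(cleaned.split())

def best_match(candidate: str, physicians: list[dict], threshold: float = 0.85):
    """Return the physician record whose name best matches `candidate`, or None."""
    cand = normalize(candidate)
    best, best_score = None, 0.0
    for record in physicians:
        score = SequenceMatcher(None, cand, normalize(record["name"])).ratio()
        if score > best_score:
            best, best_score = record, score
    return best if best_score >= threshold else None

# Example usage with toy records.
physicians = [{"id": 1, "name": "Dr. Jane A. Smith"}, {"id": 2, "name": "John Doe, MD"}]
print(best_match("jane smith", physicians, threshold=0.75))  # -> the Jane Smith record

A production matcher at this scale would also block candidates (for example by last-name prefix) before scoring, rather than comparing every pair.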

Education

Advanced Diploma, Computer Programmer Analyst

Saint Lawrence College

September 2006 - June 2008 (1 year 10 months)

BSc, Software Engineering

URBE University

September 2002 - June 2005 (2 years 10 months)