Senior Data Engineer/DevOps
Python
6 Years Of Experience
Arinze
Lagos, Nigeria
Skills
Expertise
Apache
7 years
Apache Kafka
7 years
MySQL
7 years
AWS
6 years
Data
7 years
MongoDB
4 years
MSSQL
7 years
PostgreSQL
7 years
Big Data
7 years
Python
6 years
Scala
7 years
All Skills
Apache
Apache Kafka
MySQL
AWS
Data
MongoDB
MSSQL
PostgreSQL
Big Data
Python
Cloud
Scala
Senior Data Engineer/DevOps
Arinze is a Senior Data Engineer/DevOps with over 7 years of experience. He has worked with technologies such as Hadoop, Kafka, Spark, Flume, Hive, Airflow, and Storm, and with many database and BI technologies such as Power BI, Tableau, Snowflake, and Redshift. He has extensive knowledge of Scala and Python.
Professional Experience
Head Database Architect
7 months
Unity Bank
• Transformed the logical information model into a first-cut database design, and defined data rules and a schema design that met the required business model.
• Defined and maintained the architectural frameworks/patterns, processes, standards, and guidelines related to systems, business, and data architecture.
• Developed and established high-level data flow diagrams, data standards, and naming conventions, and evaluated the consistency and integrity of the model and repository.
• Performed performance tuning of ETL code and stored procedures, SQL query optimization, and process automation.
• Migrated databases from traditional data warehouses to Spark clusters.
• Created data mapping documents for the movement of data from various data sources into Microsoft Azure.
• Created new enterprise data designs for transactional (OLTP) systems and data warehouses.
• Developed and implemented migration and upgrade procedures for databases.
• Led and applied best practices in data modeling, data mapping, and the analysis of data anomalies and discrepancies.
• Mined and analyzed data from different databases when necessary to drive optimization and improvement of product development.
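The data-standards and naming-convention work described above can be sketched as an automated check. This is a minimal, hypothetical example assuming a snake_case convention; the actual standards, table names, and columns used at Unity Bank are not described in this profile.

```python
import re

# Assumed convention: identifiers are lowercase snake_case
# (this pattern is illustrative, not the project's real standard).
SNAKE_CASE = re.compile(r"^[a-z][a-z0-9]*(_[a-z0-9]+)*$")

def check_naming(schema: dict) -> list:
    """Return a list of table/column names that violate the convention."""
    violations = []
    for table, columns in schema.items():
        if not SNAKE_CASE.match(table):
            violations.append(table)
        violations.extend(c for c in columns if not SNAKE_CASE.match(c))
    return violations

# Illustrative schema with two deliberate violations.
schema = {
    "customer_accounts": ["account_id", "OpenDate"],  # OpenDate violates
    "TxnLog": ["txn_id", "amount"],                   # TxnLog violates
}
print(check_naming(schema))  # → ['OpenDate', 'TxnLog']
```

A check like this can run in CI against schema migration files, catching naming drift before it reaches the data model repository.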
MSSQL
Python
Azure
Apache Kafka
Senior Data Engineer
5 years 2 months
Zenith Bank
• Used DMVs to analyze wait statistics, latches, spinlocks, CPU and memory pressure, I/O bottlenecks, and tempdb optimization.
• Optimized queries by analyzing query plans, column statistics, index statistics, and table partitioning.
• Used AWR (Automatic Workload Repository) to troubleshoot and monitor performance, including analyzing ADDM and ASH reports.
• Designed, administered, implemented, and supported databases using Oracle RAC, Data Guard, Oracle 12c, and OEM 12c.
• Installed, configured, and maintained Oracle Grid Infrastructure and applied Oracle quarterly patches.
• Deployed new instances of SQL Server to Windows clusters and virtual machines, both on-premise and in Azure, as well as new SQL Azure instances.
• Administered, analyzed, recommended, designed, developed, enhanced, and supported the data structures and processes required to meet business objectives.
• Interacted with development and end-user personnel to determine application data access requirements, transaction rates, volume analysis, and other data required to develop and maintain integrated databases.
• Built, configured, and maintained replication on MySQL, HBase, and MongoDB servers.
• Worked closely with the data warehouse team to assist in the design and optimization of Extract, Transform, and Load (ETL) processes.
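The Extract, Transform, Load pattern mentioned above can be illustrated with a minimal, self-contained sketch. In-memory SQLite databases stand in for the real source and warehouse systems, and the table names and cents-to-decimal transformation are invented for the example:

```python
import sqlite3

# Stand-in "source" system with raw transaction amounts in cents.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE raw_txns (id INTEGER, amount_cents INTEGER)")
src.executemany("INSERT INTO raw_txns VALUES (?, ?)",
                [(1, 1250), (2, 399), (3, 10000)])

# Stand-in "warehouse" with a fact table holding decimal amounts.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE fact_txns (id INTEGER, amount REAL)")

# Extract rows from the source, transform cents into a decimal amount,
# and load the result into the warehouse fact table.
rows = src.execute("SELECT id, amount_cents FROM raw_txns").fetchall()
warehouse.executemany("INSERT INTO fact_txns VALUES (?, ?)",
                      [(i, cents / 100.0) for i, cents in rows])

total = warehouse.execute("SELECT SUM(amount) FROM fact_txns").fetchone()[0]
print(round(total, 2))  # → 116.49
```

Production ETL adds incremental extraction, idempotent loads, and scheduling, but the extract → transform → load shape stays the same.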
AWS
Azure
PostgreSQL
MongoDB
Data Engineer/Analyst
2 years 9 months
Face Photo Industrial
• Constructed product-usage SDK data using PySpark, Scala, Spark SQL, and Hive contexts in partitioned Hive external tables maintained in an AWS S3 location, for reporting, data science dashboarding, and ad-hoc analyses.
• Created tables, partitioned tables, join conditions, correlated subqueries, nested queries, views, sequences, and synonyms for business application development.
• Used Spark and Scala to develop machine learning algorithms that analyze clickstream data.
• Created a data lake using Spark for use by downstream applications.
• Designed and developed Scala workflows to pull data from cloud-based systems and apply transformations to it.
• Developed pipelines to pull data from Redshift and send it to downstream systems through S3 and SFTP.
• Supported data transfers, imports/exports, reports, user queries, and problem resolution.
• Analyzed, designed, built, delivered, and maintained metadata, BI dashboards, and other requested data extracts.
• Worked in an agile methodology, interacting directly with the entire team to give and receive design feedback, suggest and implement optimal solutions, and tailor the application to business requirements while following standards.
• Gathered and analyzed data requirements and designed a database solution for a project that replaced several legacy mainframe systems.
• Assisted in developing system requirements, analyzing transactions and data volumes, and designing and developing complex databases and applications.
• Participated in large-scale development initiatives, including product selection and implementation.
• Acquired data from transactional source systems into the data warehouse (typically using Python, Spark, Amazon EMR, or Amazon Kinesis).
• Designed DAGs using Airflow, Luigi, and AWS Data Pipeline (Airflow preferred).
• Used Python, R, and Matlab/Octave for advanced data analysis and the development of basic machine learning scripts.
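Partitioned Hive external tables on S3, as mentioned in the first bullet, rely on key=value directory prefixes. The sketch below builds such a prefix in plain Python; the bucket, table, and partition key names are illustrative, not those of the actual project:

```python
from datetime import date

def partition_prefix(bucket: str, table: str, dt: date, region: str) -> str:
    """Return a Hive-style partitioned S3 key prefix (dt=/region= keys
    are assumed partition columns for this example)."""
    return (f"s3://{bucket}/{table}/"
            f"dt={dt.isoformat()}/region={region}/")

print(partition_prefix("analytics-bucket", "sdk_events",
                       date(2020, 5, 1), "eu"))
# → s3://analytics-bucket/sdk_events/dt=2020-05-01/region=eu/
```

Because Hive and Spark prune partitions by matching these path segments against filter predicates, queries that filter on `dt` or `region` only read the matching prefixes rather than scanning the whole table.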
Spark
AWS
Big Data
Python
Education
Covenant University
BSc Information and Communication Engineering
Developer Profile
Leadership:
Score 4: Capable Tech Lead, drives technical strategy, and envisions project scope.
Communication:
Score 5: Expertly conveys technical scope and vision, aligns tech solutions with business requirements, and mentors in effective communication.
Professional Experience:
Score 5: Veteran Technologist - Carries vast experience, having helmed diverse and complex projects across industries, and showcases a history of strategic and impactful project completions.
Engineering Skill:
Score 4: Tech Lead/Architect - Oversees large-scale projects, defines architectural decisions, and strategizes tech solutions.
Professionalism:
Score 5: Elite Professional - Embodies the highest standards in all interactions, mentors others in professional growth, and sets the benchmark for workplace excellence.
Skill Evaluation: 9/10
Professional Strengths
- Versatile data engineer with over 10 years in total across different data-related positions.
- Strong DevOps skills, with experience working with several of the leading cloud providers as well as traditional on-premise servers.
About me
• Hi, I am Arinze. It will be a pleasure to work with you on any data-related project.
