Tahir Khalid
- Riyadh, Saudi Arabia
- February 25, 2021
I have 10+ years of IT and telecommunications experience, and my credentials are verified by NZQA and the EU Blue Card Network.
For more than the last 7 years I have been working with Huawei Technologies, contracted through Seder Group (now known as Fircroft), on the STC (Saudi Telecom Company) project in Saudi Arabia.
Working as a Reporting and Data Analysis Engineer, with the following technologies and job responsibilities:
- Analysis of big data by writing Apache Hive SQL queries on a Hadoop (HDFS) cluster through the Hue/Ambari UI
- Writing complex Hive SQL queries on the Hadoop (HDFS) cluster to extract data per customer requirements
- Scheduling and managing Hadoop jobs by preparing simple workflows using Apache Oozie
- Importing and exporting data to and from the Hadoop HDFS cluster using Sqoop
- Using R and Python scripts to automate the data extraction process for reports
- Building ETL transformations with Pentaho Data Integration and scheduling the jobs using Windows Task Scheduler
- Administration of M.C storage systems connected to Hadoop DB servers
- Automation of Excel reports using VBA macros
- Writing Linux shell scripts and Windows batch scripts
- Preparing reports on network KPIs of 3G/LTE networks for STC using the Huawei CEM DB/SEQ system
- Providing Huawei big data platform support for the CEM application to Saudi Telecom (STC) users
Besides this, I have also worked with the following tools and languages:
- Oracle E-Business Suite R12 (Oracle Certified)
- Oracle Application Framework (OAF)
- Java (Certified Programmer)
I believe my skills and experience are well suited to your job offer, and I hope you will give due consideration to my application. I look forward to hearing from you soon.
Thanks and regards,
Tahir Khalid
Mobile: +966507205037
Email: tahirkhalid@hotmail.com
Education
Master of Business and Information Technology
Experience
Reporting and Data Analysis Engineer, Huawei Technologies (contracted through Seder Group, now Fircroft), STC Project, Saudi Arabia — technologies and responsibilities as listed above.