
Welcome to the Malt profile of Muhammad Moiz!

On Malt you can access a pool of top freelance talent for all your projects. You can contact Muhammad Moiz for free and chat with him, or reach out to other freelancers and request non-binding quotes.

Muhammad Moiz Ahmed

DWH, MicroStrategy, Python, Big Data, Data Vault

Can travel to the following cities: Berlin, Hannover, Hamburg, Frankfurt am Main, Köln, Düsseldorf, Essen, Nürnberg

  • Indicative rate: €700/day
  • Professional experience: 7+ years
  • Response rate: 75%
  • Response time: 12h

Available part-time, evenings & weekends


Location and mobility

Location
Berlin, Germany
Willing to work on-site at your office in:
  • Berlin and within 50 km
  • Hannover
  • Hamburg
  • Frankfurt am Main
  • Köln
  • Düsseldorf
  • Essen and within 50 km
  • Nürnberg

Project preferences

Project duration
  • 1 to 3 months
  • 3 to 6 months
  • ≥ 6 months

Checklist

Signed the Malt freelancer charter

Verified email address

Languages

  • German

    Business fluent

  • English

    Fluent/native

Categories

Skills (11)

  • Databases
  • ETL
  • BigData

Muhammad Moiz in a few words

I have over 10 years of experience in business intelligence and data warehousing. In the last five years I have worked intensively with Oracle, Exasol and MicroStrategy. In the big data space I have built GDPR-compliant data lakes with Cloudera CDH (Avro, Hive/Impala) and AWS (S3, Athena/Redshift). I also have data engineering experience with Python 3.7 and NiFi.

Project and professional experience

Consulting firm

Digital agencies & IT consulting

Data Engineer - As a freelancer

Heidelberg, Germany

November 2019 - February 2020 (2 months)

Development of a complete data engineering solution using Python, NiFi, Liquibase and MySQL. The solution was the backbone of a new web-based application for the client: it handled the acquisition and integration of source data, which was then provisioned to the new web solution.

Data ingestion developed in NiFi, following a layer-by-layer ingestion design:

  • The design used one process group per data layer.
  • The process groups were placed and scheduled in a main NiFi flow.
  • Certain process groups were connected to execute in parallel.
  • Use of the InvokeHTTP and GetFile processors for the initial ingestion.
  • Monitoring of the execution of processors and flow files.
  • Integration of Python with NiFi over the NiFi REST API.
  • Programmatic access (using Python) to the NiFi flow, process groups and processors.
  • Access to all processors and their attributes (in Python) via the respective process group ID; a sketch follows this list.
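
A minimal sketch of the last two points, assuming a NiFi instance reachable at http://localhost:8080 without authentication (the URL, the missing auth and the printed fields are illustrative, not the original project's configuration):

    # Walk a NiFi flow over the NiFi REST API with Python requests.
    import requests

    NIFI_URL = "http://localhost:8080/nifi-api"  # assumed local instance

    def list_processors(process_group_id):
        """Return id, name and state of every processor in a process group."""
        resp = requests.get(f"{NIFI_URL}/process-groups/{process_group_id}/processors")
        resp.raise_for_status()
        return [
            {
                "id": p["id"],
                "name": p["component"]["name"],
                "state": p["component"]["state"],
            }
            for p in resp.json()["processors"]
        ]

    # "root" is an alias NiFi accepts for the id of the top-level process group.
    for proc in list_processors("root"):
        print(proc["name"], proc["state"])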

Further developments in Python:

  • Metadata handling component: converts the Avro primitive data types from the big data metadata into the corresponding data types of the target MySQL database.
  • Component to download large CSV files over a REST API as streams (using requests' iter_content) in parallel threads.
  • Loading of CSV files into MySQL using the LOAD DATA INFILE command via SQLAlchemy (+pymysql); see the sketch after this list.
  • Data integration job: a main Python job that ties the other components together.
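
A minimal sketch of the download-and-load path from the two bullets above; the URL, credentials and table name are placeholders, not the original project's values:

    # Stream a large CSV over HTTP with requests' iter_content, then
    # bulk-load it into MySQL with LOAD DATA LOCAL INFILE via SQLAlchemy + pymysql.
    import requests
    from sqlalchemy import create_engine, text

    CSV_URL = "https://example.com/export/data.csv"  # hypothetical endpoint
    LOCAL_PATH = "/tmp/data.csv"

    def download_csv(url, path, chunk_size=1 << 20):
        """Stream the response to disk so the whole file never sits in memory."""
        with requests.get(url, stream=True) as resp:
            resp.raise_for_status()
            with open(path, "wb") as fh:
                for chunk in resp.iter_content(chunk_size=chunk_size):
                    fh.write(chunk)

    def load_csv(path, table):
        # local_infile must be enabled on both the client and the server.
        engine = create_engine("mysql+pymysql://user:password@localhost/db?local_infile=1")
        with engine.begin() as conn:
            conn.execute(text(
                f"LOAD DATA LOCAL INFILE '{path}' INTO TABLE {table} "
                "FIELDS TERMINATED BY ',' IGNORE 1 LINES"
            ))

    download_csv(CSV_URL, LOCAL_PATH)
    load_csv(LOCAL_PATH, "staging_table")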

Python Libraries:
requests, SQLAlchemy, multiprocessing, pandas, dotenv, logging
Python Linux Docker PyCharm Liquibase Putty Real VNC Citrix MySQL

Insurance industry

Banks & insurance

Data Vault Data Lake Consultant - As a freelancer

Köln

April 2019 - July 2019 (3 months)

A proof-of-concept (POC) big data project with Data Vault 2.0 modelling, using the AWS, Snowflake and Cloudera (CDH) cloud solutions. High-level overview:

  • Development and modelling of business entities into Data Vault 2.0 entities.
  • Implementation of the Data Vault modelled entities in cloud data lakes.
  • Implementation of the data load mechanism from the source Oracle database, using Python.
  • Delivery of information marts in all three clouds.

The project was implemented on all three cloud solutions to give the client an accurate evaluation and documentation. The technical details covered were:

  • Use of the Avro file format for storing data in the data lakes (S3 data lake in AWS, HDFS in CDH).
  • In the HDFS-based Cloudera data lake, SQL development was done using Hive and Impala.
  • Encryption of customer personal data while loading into the data lake, for GDPR compliance (CDH only); a sketch follows this list.
  • Use of Athena/Redshift for Data Vault SQL development on top of the AWS S3 data lake.
  • Use of Snowflake for Data Vault SQL development on top of the AWS S3 data lake.
  • Configured Snowpipes on top of the S3 data lake to automate data loading into Snowflake; AWS SQS was used to set up the Snowpipe.
  • The initial load into Snowflake was done using the COPY command.
  • Information marts (both virtualised and materialised) from Snowflake, Redshift and Impala (CDH) were successfully connected with Tableau, showing the same accurate KPIs in the dashboards.
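
A minimal sketch of the field-level encryption from the GDPR bullet, using Crypto.Cipher from the library list below; AES-GCM mode and the record layout are assumptions, not the original implementation:

    # Encrypt personal fields of a record before it is written to the lake.
    from Crypto.Cipher import AES
    from Crypto.Random import get_random_bytes

    key = get_random_bytes(32)  # in practice the key would come from a key vault

    def encrypt_field(value, key):
        """AES-GCM encrypt one field; nonce and tag are kept for later decryption."""
        cipher = AES.new(key, AES.MODE_GCM)
        ciphertext, tag = cipher.encrypt_and_digest(value.encode("utf-8"))
        return {"nonce": cipher.nonce, "ciphertext": ciphertext, "tag": tag}

    record = {"customer_id": "C-1001", "name": "Jane Doe", "revenue": 123.45}
    record["name"] = encrypt_field(record["name"], key)  # personal data only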

A notable feature of the data lake loading was the pre-computation of hash keys, which were materialised as Avro files in the lake.
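
A minimal sketch of that pre-computation, using hashlib and fastavro from the library list below; the hub schema and its field names are illustrative, not the project's originals:

    # Pre-compute Data Vault hash keys and materialise them as an Avro file.
    import hashlib
    from fastavro import parse_schema, writer

    schema = parse_schema({
        "name": "hub_customer",
        "type": "record",
        "fields": [
            {"name": "customer_hk", "type": "string"},  # hash key
            {"name": "customer_bk", "type": "string"},  # business key
            {"name": "load_dts", "type": "string"},
            {"name": "record_source", "type": "string"},
        ],
    })

    def hash_key(*business_keys):
        """Data Vault style hash: normalised business keys, joined and hashed."""
        normalised = "||".join(k.strip().upper() for k in business_keys)
        return hashlib.md5(normalised.encode("utf-8")).hexdigest()

    records = [
        {
            "customer_hk": hash_key(bk),
            "customer_bk": bk,
            "load_dts": "2019-04-01T00:00:00",
            "record_source": "ORACLE.CRM",
        }
        for bk in ["C-1001", "C-1002"]
    ]

    with open("hub_customer.avro", "wb") as out:
        writer(out, schema, records)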

Python Libraries: hashlib, cx_Oracle, multiprocessing, pyarrow, psycopg2, uuid, Crypto.Cipher, json, boto3, fastavro, BytesIO
Python Data Vault Cloudera AWS Hive Impala Redshift S3 Snowflake SQS Avro

Telecom solution provider

Telecommunications

Middleware Specialist - As a freelancer

Berlin, Germany

September 2018 - August 2019 (11 months)

Middleware specialist for data integration and exchange between internal and external systems and sources.

  • REST APIs and gateways with JSON, XML, IDocs and JavaScript.
  • Data structure/model analysis between SAP, Salesforce and real-time microservices, and the respective data mapping.
  • Development in MS SQL Server 2014 (SSMS).
  • SQL and stored procedure development with transaction management and exception handling (see the sketch after this list).
  • Test evidence and UAT documentation.
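
A minimal sketch of the transaction/exception pattern from the stored procedure bullet, executed from Python via pyodbc; the connection string and the procedure itself are assumed, not the original project's code:

    # Create and call a SQL Server stored procedure whose body wraps its
    # writes in a transaction with TRY/CATCH error handling.
    import pyodbc

    PROC_DDL = """
    CREATE PROCEDURE dbo.usp_upsert_order @payload NVARCHAR(MAX)
    AS
    BEGIN
        SET XACT_ABORT ON;
        BEGIN TRY
            BEGIN TRANSACTION;
            -- ... parse @payload and write to the target tables ...
            COMMIT TRANSACTION;
        END TRY
        BEGIN CATCH
            IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
            THROW;  -- re-raise so the calling middleware sees the failure
        END CATCH
    END
    """

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
        "DATABASE=integration;UID=user;PWD=password"
    )
    cur = conn.cursor()
    cur.execute(PROC_DDL)
    conn.commit()
    cur.execute("EXEC dbo.usp_upsert_order ?", '{"order_id": 1}')
    conn.commit()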

Stack: MSSQL Server 2014, IBM AppConnect (Middleware), JSON/XML, CSV
Middleware SQL Server API REST IDOC JSON Javascript IBM App Connect

International Retail

Retail

MicroStrategy Frontend Developer - As a freelancer

Ruhrgebiet

August 2017 - March 2018 (7 months)

Part of the frontend team, responsible for the implementation of MicroStrategy use cases for the retail business.

  • Business validation of requirements, with the RE & architecture teams.
  • Solution concept workshops, with the architecture & business teams.
  • Implementation of MicroStrategy use cases (packages 2 & 3).
  • Liaising between the backend and frontend teams.

Implementation of the MicroStrategy requirements:

  • Creation, adaptation and extension of the necessary schema objects, such as:
    ◦ Mapping of attributes (IDs, forms).
    ◦ Parent-child relationships & hierarchies.
  • Report objects:
    ◦ Datasets with level and derived metrics.
    ◦ Reports, filters, prompts.
  • Documents:
    ◦ Use of panel stacks, selectors, grids and graph components.
    ◦ Use of multiple datasets.
  • Visual Insight dashboards
  • Transaction Services
  • Multiple datasets and data marts
  • Job prioritisation
  • Cube optimisation & incremental refresh reports
  • Deployment (every two weeks)
MicroStrategy Visual Insights iCube Level Metric Datasets Document

FinTech firm

Banks & insurance

Head of Data Technology

Berlin, Germany

September 2015 - September 2017 (2 years)

Part of the management team, responsible for leading the BI and analytics function.

Rolled out MicroStrategy 10.

MicroStrategy use cases, built with Visual Insights:
  • Investor fact sheets and pitch decks.
  • Financial metrics (some of the most important KPIs of the FinTech industry): IRR calculations, annualised net returns (unadjusted), and the use of probabilities of default and the related maths (see the sketch after this list).
  • Performance marketing dashboards & reports per channel.
  • Transaction Services document, exposed in VI, to capture marketing costs directly in the DWH (through MSTR).
  • Customer insights for the operations and CC teams (a package of multiple dashboards & visualisations for the CC team & the head of Ops).
  • Payment processing and overdue-related KPIs.
  • Portfolio performance forecast.
  • User attributes & correlations.
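
A minimal sketch of the IRR and annualised-return arithmetic behind those KPIs, using numpy_financial with illustrative cash flows rather than real portfolio data:

    # IRR of a loan's monthly cash flows, then annualised.
    import numpy_financial as npf

    # -1000 disbursed, followed by twelve monthly repayments of 90.
    cash_flows = [-1000] + [90] * 12

    monthly_irr = npf.irr(cash_flows)
    annualised = (1 + monthly_irr) ** 12 - 1
    print(f"monthly IRR: {monthly_irr:.4%}, annualised: {annualised:.4%}")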

Administration of the MicroStrategy Windows Server (on AWS Cloud).

Achievements include:
  • Managed self-service BI (via MicroStrategy).
  • Successful closing of audits, with a positive opinion.
Head of BI Finance KPIs MicroStrategy Visual Insights Performance Marketing BI

Oleksandr Erm - Crosslend GmbH

8.1.2021

I was leading a software development team at Crosslend GmbH when I had the pleasure of working with Muhammad Moiz Ahmed. We frequently worked together, as software teams and data teams are always close. He has deep knowledge of data handling and data analytics techniques. The analytics tools he built have driven business decisions for years to come. His team was the gold standard when it came to the quality of the data provided. Do you need the right data for your task? He has it. Personally, it was a pleasure working with him. He is the kind of person you can have a well-argued discussion with on many topics, inside or outside of his primary skill set. I'm looking forward to working on more projects with him.

Certifications