Work Experience
Java Technical Architect
ITS • December 2007 - Present
• Built a microservices architecture, including a Eureka naming service and an API gateway, for a new project, then deployed it on a Docker instance.
• Created Docker images preloaded with the project runtime environment for development, testing, and UAT.
• Designed the architecture for migrating J2EE service-based applications to microservices and certified the new projects on EAP 7.4 and WebLogic 14c.
• Led developers in migrating traditional J2EE projects to Spring Boot and Maven.
• Designed and developed a digital signature framework.
• Participated in certifying J2EE backend applications on Oracle WebLogic 14c, IBM WebSphere 8.5.2, and EAP 7.4 application servers.
• Collaborated on all stages of the systems development lifecycle, including analysis, design, development, and testing.
• Participated in writing software gap analysis documents for new customer requirements.
• Participated in writing software detailed design documents and coached developers in coding to the design.
• Built an Oracle WebLogic 14c SSL cluster environment for the J2EE backend and frontend applications.
• Designed and developed a token verification framework to synchronize and propagate the frontend HTTP session to the J2EE backend.
• Designed and developed an audit trail framework and injected it vertically across all J2EE backend application modules.
• Consulted with the project manager to determine system loads and develop improvement plans.
• Collaborated with developers to identify performance bottlenecks in the J2EE backend and frontend applications.
• Led twelve developers in migrating the J2EE backend and frontend applications from BEA WebLogic 8 to Oracle WebLogic 10gR3.
• Researched the OWASP Top 10 security vulnerabilities and developed a POC applying the findings to the J2EE backend application.
• Led developers in developing and implementing the J2EE backend and frontend applications to integrate with a call center IVR system on the customer site.
• Deployed and configured a jBPM 6.5 backend application on Docker.
• Performed functional consulting services and remotely implemented products and patches on customer sites.
• Upgraded a jBPM 6.1 backend application to jBPM 6.5.
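One common way to build the kind of token verification framework described above is an HMAC-signed session token that the frontend issues and the backend re-checks. The sketch below is a hedged illustration only, assuming an HMAC-SHA256 design; the class name, token layout, and shared-secret handling are hypothetical, not the actual ITS implementation:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

/** Hypothetical sketch: the frontend issues an HMAC-signed token for a
 *  session id, and the J2EE backend recomputes the signature to accept
 *  or reject the request. */
public class TokenVerifier {
    private final byte[] key;

    public TokenVerifier(byte[] key) { this.key = key.clone(); }

    /** Sign a session id, producing "sessionId.signature". */
    public String issue(String sessionId) throws Exception {
        return sessionId + "." + hmac(sessionId);
    }

    /** Verify a token of the form "sessionId.signature". */
    public boolean verify(String token) throws Exception {
        int dot = token.lastIndexOf('.');
        if (dot < 0) return false;
        String sessionId = token.substring(0, dot);
        String sig = token.substring(dot + 1);
        return hmac(sessionId).equals(sig);
    }

    private String hmac(String data) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(key, "HmacSHA256"));
        return Base64.getUrlEncoder().withoutPadding()
                .encodeToString(mac.doFinal(data.getBytes(StandardCharsets.UTF_8)));
    }

    public static void main(String[] args) throws Exception {
        TokenVerifier v = new TokenVerifier("demo-secret".getBytes(StandardCharsets.UTF_8));
        String token = v.issue("JSESSIONID-1234");
        System.out.println(v.verify(token));        // untampered token is accepted
        System.out.println(v.verify(token + "x"));  // tampered token is rejected
    }
}
```

The Base64 URL encoder keeps the signature free of "." characters, so splitting on the last dot is unambiguous.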
Bigdata Developer
Atos • October 2020 - June 2023
• Experienced across the full big data pipeline: developing custom Kafka connectors (Java) or using community connectors to extract data from source systems (Firebase Analytics, MongoDB, SQL Server, Oracle), developing Flink jobs (Java) to normalize the data into a single format (the standard Debezium format), and loading the extracted data into a data warehouse for reporting.
• Developed a Kafka source connector from scratch to extract data from MongoDB and write the results to a Kafka topic, then wrote a Flink job to consume the topic messages and write them to HDFS.
• Developed a Kafka source connector from scratch to execute custom queries on Google Cloud BigQuery and write the results to a Kafka topic, then wrote a Flink job to consume the topic messages and write them to HDFS.
• Developed a Kafka source connector from scratch to read data from Firebase Analytics V4 and write the results to a Kafka topic, then wrote a Flink job to consume the topic messages and write them to HDFS.
• Built a Firebase Kafka connector to read data from a Firebase real-time stream and write the contents to a Kafka topic.
• Developed Flink jobs that consume data from Kafka topics.
• Built mappings on the source systems to run ETLs that fill the DWH; this data is used to generate reports.
• Developed microservices, a Eureka naming service, and an API gateway, then deployed them on OpenShift using DevOps pipelines.
• Participated in developing jobs with Flink and PyFlink.
• Participated in designing and developing big data solutions using Hadoop ecosystem tools (HDFS, Flume, Sqoop, Apache PySpark, Java MapReduce).
• Experienced in coding with RDD, Avro, and Parquet using Apache PySpark on Hadoop.
• Participated in building a bank's customer service chatbot with IBM Watson.
• Built a recommendation model with Python on AWS.
• Wrote Dockerfiles to replicate the customer environment locally with the same configurations.
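The normalization step described above reshapes rows from heterogeneous sources into a standard Debezium-style change envelope so downstream Flink jobs can treat every source uniformly. A minimal sketch of that reshaping, using only the JDK; the class name and field values are illustrative assumptions, though the envelope fields (before/after/source/op/ts_ms) follow Debezium's documented event shape:

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Hypothetical sketch: wrap a raw source row in a Debezium-style change
 *  envelope (before/after payload plus source metadata and an operation
 *  code), the unified format the Flink jobs consume. */
public class EnvelopeMapper {

    /** Build a Debezium-like envelope for an insert ("op" = "c"). */
    public static Map<String, Object> toEnvelope(String connector, String table,
                                                 Map<String, Object> row, long tsMillis) {
        Map<String, Object> source = new LinkedHashMap<>();
        source.put("connector", connector);  // e.g. "mongodb", "sqlserver"
        source.put("table", table);
        source.put("ts_ms", tsMillis);

        Map<String, Object> envelope = new LinkedHashMap<>();
        envelope.put("before", null);        // no prior row image for an insert
        envelope.put("after", row);          // the new row state
        envelope.put("source", source);
        envelope.put("op", "c");             // Debezium op code for "create"
        envelope.put("ts_ms", tsMillis);
        return envelope;
    }

    public static void main(String[] args) {
        Map<String, Object> row = new LinkedHashMap<>();
        row.put("id", 42);
        row.put("name", "demo");
        Map<String, Object> env = toEnvelope("mongodb", "customers", row, 1700000000000L);
        System.out.println(env.get("op"));     // c
        System.out.println(env.get("after"));  // {id=42, name=demo}
    }
}
```

In the real pipeline this object would be serialized (e.g. to JSON or Avro) before being written to the Kafka topic; serialization is omitted here to keep the sketch dependency-free.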
Software Engineer
RHB Bank • January 2019 - December 2020
• Developed Spring Boot scheduled jobs.
• Developed a Spring core module using JPA over Hibernate to handle business queries against the database.
• Developed and fixed RESTful web services integrated with JPA over Hibernate for RHB Bank.
• Participated in developing the RHB SME Business Loan application.
• Developed a security hashing and digital signature framework using the RSA algorithm, injecting it vertically via Java dependency injection annotations across all J2EE application modules: backend modules, service modules, and multiple frontend modules (web and Android applications).
• Completed a microservices development training course in preparation for migrating all developed REST services to a microservices architecture.
• Developed a backend module using EJB over JPA that serves as a framework other developers can use to implement business logic in any J2EE application.
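The core of an RSA digital signature framework like the one above is signing with a private key and verifying with the public key. A minimal, self-contained sketch using the JDK's java.security API; the 2048-bit key size, SHA256withRSA algorithm, and sample payload are assumptions for illustration, not the bank's actual configuration (which would load keys from a keystore rather than generating them):

```java
import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;

/** Hypothetical sketch of RSA signing and verification with the JDK. */
public class RsaSignatureDemo {
    public static void main(String[] args) throws Exception {
        // Generate a throwaway RSA key pair for the demo.
        KeyPairGenerator gen = KeyPairGenerator.getInstance("RSA");
        gen.initialize(2048);
        KeyPair pair = gen.generateKeyPair();

        byte[] payload = "sample-payload".getBytes(StandardCharsets.UTF_8);

        // Sign the payload with the private key.
        Signature signer = Signature.getInstance("SHA256withRSA");
        signer.initSign(pair.getPrivate());
        signer.update(payload);
        byte[] signature = signer.sign();

        // Verify the signature with the public key.
        Signature verifier = Signature.getInstance("SHA256withRSA");
        verifier.initVerify(pair.getPublic());
        verifier.update(payload);
        System.out.println(verifier.verify(signature));  // true for an untampered payload
    }
}
```

Injecting a wrapper around this logic as a CDI or Spring bean is what lets the same framework serve every module without duplicated crypto code.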
Education
Nile University
Data Science, B.Tech • March 2016 - August 2017
• Detailed study of machine learning and data science.
• Detailed study of big data processing with Java MapReduce and PySpark on the Apache Hadoop ecosystem.
• Graduation project: Inferring Networks Recommender System for Amazon products.
Cairo University
Computer Science, B.Sc • September 1995 - May 1999
• Studied Computer Science, Data Science, System Analysis and Design, and Algorithms and Data Structures.