My research interests lie at the intersection of machine learning and the Semantic Web. Specifically, I focus on using Semantic Web technologies to explain the automated decisions made by black-box AI systems in a human-understandable way. By combining semantic background knowledge with statistical methods, I aim to create human-understandable explanations for the accurate but opaque decisions produced by machine learning algorithms. Previously, I worked on making ontology editing easier, developing two plugins (OWLAx and ROWL) for the Protégé ontology editor.
Alongside my PhD, I did summer internships at Intel Corporation and Accenture Technology Labs. At Intel, I analyzed performance bottlenecks of the TensorFlow framework on Intel CPUs and worked on making the framework scale across multiple nodes. As a research intern at Accenture in summer 2017, I analyzed various explainable AI (XAI) methods for producing human-understandable explanations.
Before starting my PhD, I worked at Samsung Electronics for two years as a software engineer, focusing on the connectivity module of the Android OS.
I hold an MS in Computer Science and earned the Big and Smart Data graduate certificate from Wright State University. I also completed the Deep Learning Specialization on Coursera. Certificate link: https://www.coursera.org/account/accomplishments/specialization/39UJH6YRBFZN
Research Interests: Deep Learning, Semantic Web, Linked Data
Current Research Projects:
1. Knowledge Graph Based Data Analysis:
Analyzing data in connection with its context makes it more interpretable and valuable than analyzing it as discrete items. By connecting data with a knowledge graph, we can make more sense of it and guide data exploration to surface additional information. I am building methods and tools for knowledge-graph-based data analysis.
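As a minimal sketch of the idea (the triples and labels below are hypothetical toy examples, not from any real project or dataset), raw data items can be linked to a tiny in-memory knowledge graph so that analysis sees their context, not just the isolated labels:

```python
# Toy sketch: enrich raw data items with context from a tiny knowledge graph.
# The concepts and subClassOf links here are hypothetical illustrations.

# A mini knowledge graph as (subject, predicate, object) triples.
KG = {
    ("Cat", "subClassOf", "Mammal"),
    ("Dog", "subClassOf", "Mammal"),
    ("Mammal", "subClassOf", "Animal"),
}

def ancestors(concept):
    """Follow subClassOf links to collect all superclasses of a concept."""
    result = []
    frontier = [concept]
    while frontier:
        current = frontier.pop()
        for s, p, o in KG:
            if s == current and p == "subClassOf" and o not in result:
                result.append(o)
                frontier.append(o)
    return result

def enrich(data_items):
    """Attach knowledge-graph context to each raw data label."""
    return {item: ancestors(item) for item in data_items}

print(enrich(["Cat", "Dog"]))
# e.g. {'Cat': ['Mammal', 'Animal'], 'Dog': ['Mammal', 'Animal']}
```

In practice the background knowledge would come from a real ontology or linked-data source rather than a hand-written triple set, but the principle is the same: the context pulled in from the graph guides what extra questions the analysis can ask of the data.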
2. Explainable AI:
Deep learning algorithms perform very well, sometimes even outperforming humans, but their decision-making process remains opaque. Explainability matters for safety-critical applications such as medical diagnostics and emergency response. We are working to explain the decisions of these algorithms. Our results, "Explaining Trained Neural Networks with Semantic Web Technologies: First Steps" and "Relating Input Concepts to Convolutional Neural Network Decisions", have appeared at the AAAI conference and at the NeSy and NIPS workshops.
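The general flavor of the approach can be sketched as follows (a toy label-to-concept hierarchy with hypothetical names, not the actual published pipeline): lift a network's output labels into a class hierarchy and report the most specific concept they share as a concept-level explanation:

```python
# Toy sketch: explain a classifier's outputs by lifting its labels into a
# concept hierarchy and reporting their most specific common superclass.
# The hierarchy and labels are hypothetical illustrations.

SUPERCLASS = {
    "ambulance": "EmergencyVehicle",
    "fire_truck": "EmergencyVehicle",
    "EmergencyVehicle": "Vehicle",
    "Vehicle": "Thing",
}

def lineage(concept):
    """The concept plus all its superclasses, most specific first."""
    chain = [concept]
    while chain[-1] in SUPERCLASS:
        chain.append(SUPERCLASS[chain[-1]])
    return chain

def explain(predicted_labels):
    """Return the most specific concept shared by all predicted labels."""
    chains = [lineage(label) for label in predicted_labels]
    common = set(chains[0]).intersection(*map(set, chains[1:]))
    # Pick the shared concept that appears earliest (most specific) in a chain.
    for concept in chains[0]:
        if concept in common:
            return concept

print(explain(["ambulance", "fire_truck"]))  # EmergencyVehicle
```

A human reading "EmergencyVehicle" gets a meaningful, ontology-grounded summary of what the raw labels have in common, which is the kind of explanation a bare probability vector cannot provide.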
3. Ontology Axiomatization:
Ontology Design Pattern Plugin for Desktop Protege
Accepted as a Software Demo at the 15th International Semantic Web Conference, ISWC2016, Kobe, Japan, October 2016: Md. Kamruzzaman Sarker, Adila A. Krisnadhi and Pascal Hitzler, OWLAx: A Protege Plugin to Support Ontology Axiomatization through Diagramming
Rule to OWL Axiom Conversion Plugin for Protege
Accepted as a Software Demo at the 15th International Semantic Web Conference, ISWC2016, Kobe, Japan, October 2016: Md. Kamruzzaman Sarker, David Carral, Adila A. Krisnadhi and Pascal Hitzler, Modeling OWL with Rules: The ROWL Protege Plugin.
1. Intel Internship:
Analyzed and optimized the scalability of key deep learning (DL) algorithms implemented in TensorFlow on Intel Xeon CPUs. Created a containerized solution for Intel-optimized TensorFlow using Docker and Singularity.
2. Accenture Internship:
Insurance companies, especially health insurers, need to estimate the cost of a new customer. Currently they collect user information and calculate the cost manually; we worked to automate this process using deep learning algorithms.
3. Samsung Android:
Worked on the Android Connectivity team. Resolved more than 100 Wi-Fi connectivity issues across 30+ Samsung Android devices.
Developed a music listener and sync feature between Android devices and Samsung Gear. Developed a decryption method for iCloud contacts used in Smart Switch migration. All code was reviewed and pushed to production.
Online Profiles:
Google Scholar: https://scholar.google.com/citations?user=dnySX2QAAAAJ&hl=en