[1] (Undergraduate Degree) (2015/09 - 2019/06)
Hong Kong University of Science and Technology: Double major in Computer Science and Mathematics.
[2] (Special Student (Coursework and Research)) (2018/02 - 2018/07)
Massachusetts Institute of Technology: Computer Science (Course 6).
[3] (Visiting Student (Research Internship)) (2017/06 - 2017/08)
Massachusetts Institute of Technology: Any Scale Learning for All (ALFA) Group, Computer Science and Artificial Intelligence Laboratory (CSAIL).
[4] (High School Student) (2012/09 - 2015/06)
High School Affiliated to Renmin University of China.
Personal Homepage (Complete Version)
[1] (Accepted) Mucong Ding, Yanbang Wang, Erik Hemberg, and Una-May O'Reilly. 2019. Transfer Learning using Representation Learning in Massive Open Online Courses. In Proceedings of International Learning Analytics and Knowledge Conference (LAK'19). ACM, New York, NY, USA, 10 pages.
[2] (Accepted) Mucong Ding, Kai Yang, Dit-Yan Yeung, and Ting-Chuen Pong. 2019. Effective Feature Learning with Unsupervised Learning for Improving the Predictive Models in Massive Open Online Courses. In Proceedings of International Learning Analytics and Knowledge Conference (LAK'19). ACM, New York, NY, USA, 10 pages.
[3] (Published) Mu Cong Ding and Kwok Yip Szeto. 2017. Selection of Random Walkers that Optimizes the Global Mean First Passage Time for Search in Complex Networks. In Proceedings of International Conference on Computational Science (ICCS'17). Procedia Computer Science, Zurich, Switzerland, 5 pages.
[4] (In preparation) Mucong Ding, Erik Hemberg, and Una-May O'Reilly. 2018. MOOC Learner Data Science Analytics. Preparing to submit to the 12th International Conference on Educational Data Mining (EDM'19), Montreal, Canada.
[5] (In preparation) Mu Cong Ding and Kwok Yip Szeto. 2018. First-passage time distribution for random walks on complex networks using inverse Laplace transform and mean-field approximation. Preparing to submit to Physical Review E.
[6] (Thesis) Mucong Ding, Sirui Lu, and Zhiwei Ding. 2018. Adversarial Generation and Perturbation Elimination with GANs. 13 pages.
[7] (Thesis) Mucong Ding and Erik Hemberg. 2017. Observing and Understanding the Video Watching Behavior in Online Lectures. 17 pages.
[8] (Poster) Mucong Ding, Erik Hemberg, and Una-May O'Reilly. 2018. MOOC-Learner-Project Overview. At the annual meeting of the HKUST-MIT Research Alliance Consortium, 2018.
[1] (Core developer) MOOC-Learner-Project (MLP): taps the potential of Massive Open Online Course (MOOC) student behavioral data by providing data science technology that makes the data accessible for teaching and learning research, enabling insights into how students learn and how instructors can teach effectively.
[2] (Lead developer and Maintainer) MOOC-Learner Data Science Analytics (MLDSA): an end-to-end solution (processing + modeling + predicting) for MOOC data analytics that provides an easy-to-use, research-friendly interface for interactive data science and supports online, at-scale operation.
[3] (Lead developer and Maintainer) MOOC-Learner-Visualized (MLV): a platform that plots interactive and static figures for learning analytics based on the proposed features; a tool that helps education researchers form hypotheses and examine theories.
[4] (Core developer and Maintainer) MOOC-Learner-Quantified (MLQ): quantifies the MOOC learner behavior as longitudinal features.
[5] (Maintainer) MOOC-Learner-Curated (MLC): translates and curates activities captured from a MOOC learner into a relational database.
[6] (Lead developer) MOOC-Learner-Modeled (MLM): serves as an interface to train and test a wide range of classifier models on any set of learner longitudinal features, and to transfer models across weeks and courses.
[7] (Lead developer and Maintainer) MOOC-Learner-Docker (MLD): connects all MOOC-Learner-Project pipelines (MLC, MLQ, MLV, MLM) and embeds them into Docker containers with a unified configuration.
[8] (Personal Project) Rendering Thin Film Interference on Soap Bubbles: renders realistic-looking soap bubbles using WebGL. Implements an approximate thin-film interference formula and simulates the film-thickness distribution over the bubble surface, accounting for drifting and sloshing effects.
[9] (Personal Project) In-Browser Demo of Variational Auto-Encoders: demonstrates the inference process of variational auto-encoders on MNIST digits using Keras.js.