Privacy-aware human mobility modeling

    Type: Master
    Status: Assigned February 2021
    Student: Jose Castro Elizondo

    Human mobility can be modeled using smartphone data; however, such data contains sensitive information, and its collection threatens the privacy of the users involved. The recent introduction of federated learning, a privacy-preserving approach to building machine learning and deep learning models, is a promising technique for addressing this privacy issue.

    In the context of this MSc thesis or UROP project, you will: (i) create an overview of privacy-aware human mobility modeling approaches; (ii) pre-process/normalize datasets into a common format (e.g., [1][2]); (iii) create advanced privacy-aware models for human mobility modeling (e.g., Transformer networks [8][9][10] trained using federated learning [14]); (iv) compare the results to existing baselines (e.g., RNN-based baselines [5][6]); (v) summarize the results and propose future work.
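    The core idea behind federated learning in step (iii) is that clients (here, users' devices) train locally on their own mobility data and only share model weights with a server, which aggregates them; raw trajectories never leave the device. A minimal sketch of federated averaging (FedAvg), using plain linear regression on synthetic toy data in place of a Transformer and mobility traces (all names and data here are illustrative, not part of the project):

    ```python
    import numpy as np

    def local_update(weights, X, y, lr=0.1, epochs=5):
        """One client's local training: gradient descent on a linear model."""
        w = weights.copy()
        for _ in range(epochs):
            grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient
            w -= lr * grad
        return w

    def federated_averaging(clients, rounds=20, dim=2):
        """Server loop: broadcast weights, collect updates, average them.

        The average is weighted by local dataset size, as in FedAvg.
        """
        w = np.zeros(dim)
        for _ in range(rounds):
            updates = [local_update(w, X, y) for X, y in clients]
            sizes = [len(y) for _, y in clients]
            w = np.average(updates, axis=0, weights=sizes)
        return w

    # Toy setup: two clients whose local data share the true model w* = [2, -1].
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    clients = []
    for n in (40, 60):
        X = rng.normal(size=(n, 2))
        clients.append((X, X @ true_w))

    w = federated_averaging(clients)
    print(np.round(w, 2))
    ```

    The server only ever sees weight vectors, which is the privacy property the project builds on; note, however, that plain FedAvg alone does not guarantee privacy (weights can leak information), which is part of what the literature overview in step (i) should examine.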


    1. Mokhtar, Sonia Ben, Antoine Boutet, Louafi Bouzouina, Patrick Bonnel, Olivier Brette, Lionel Brunie, Mathieu Cunche et al. “PRIVA’MOV: Analysing Human Mobility Through Multi-Sensor Datasets.” 2017.
    2. Moro, Arielle, Vaibhav Kulkarni, Pierre-Adrien Ghiringhelli, Bertil Chapuis, and Benoit Garbinato. “Breadcrumbs: A Feature Rich Mobility Dataset with Point of Interest Annotation.” arXiv preprint arXiv:1906.12322 (2019).
    3. Luca, Massimiliano, Gianni Barlacchi, Bruno Lepri, and Luca Pappalardo. “Deep Learning for Human Mobility: A Survey on Data and Models.” arXiv preprint arXiv:2012.02825 (2020).
    4. Feng, Jie, Zeyu Yang, Fengli Xu, Haisu Yu, Mudan Wang, and Yong Li. “Learning to Simulate Human Mobility.” In Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 3426-3433. 2020.
    5. Feng, Jie, Yong Li, Chao Zhang, Funing Sun, Fanchao Meng, Ang Guo, and Depeng Jin. “DeepMove: Predicting Human Mobility with Attentional Recurrent Networks.” In Proceedings of the 2018 World Wide Web Conference, pp. 1459-1468. 2018.
    6. Yu, Lantao, Weinan Zhang, Jun Wang, and Yong Yu. “SeqGAN: Sequence Generative Adversarial Nets with Policy Gradient.” In Thirty-First AAAI Conference on Artificial Intelligence. 2017.
    7. scikit-mobility: mobility analysis in Python.
    8. Brown, Tom B., Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared Kaplan, Prafulla Dhariwal, Arvind Neelakantan et al. “Language Models are Few-Shot Learners.” arXiv preprint arXiv:2005.14165 (2020).
    9. Vaswani, Ashish, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, and Illia Polosukhin. “Attention Is All You Need.” In Advances in Neural Information Processing Systems, pp. 5998-6008. 2017.
    10. Devlin, Jacob, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.” arXiv preprint arXiv:1810.04805 (2018).
    11. “Open Sourcing BERT: State-of-the-Art Pre-training for Natural Language Processing”. Google AI Blog. Retrieved 2019-11-27.

    For more information contact: