Transformers for human mobility modeling

    Type: Bachelor / Master / UROP
    Status: Available

    In this BSc project you will implement a state-of-the-art model for human mobility modeling. The goal is to use Transformer networks (e.g., GPT-3, BERT), which are deep learning language/sequence modeling methods that have the potential to improve upon the previously used RNN- and LSTM-based methods. More specifically, in this project you will: (i) review methods based on Transformer networks [8][9][10] with the aim of using them for human mobility modeling; (ii) use an existing dataset to train a Transformer network for human mobility modeling (e.g., next-location prediction) [12][13]; (iii) analyze the behavior of the network across a variety of parameters (e.g., dataset size, embedding dimension, number of heads); (iv) summarize the results and propose future work.
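
    To make step (ii) concrete, below is a minimal sketch of such a model in TensorFlow/Keras, closely following the Transformer block from the Keras tutorial [13]. All names and sizes (NUM_LOCATIONS, SEQ_LEN, EMBED_DIM, NUM_HEADS, FF_DIM) are placeholder assumptions rather than recommended values, and the random arrays only illustrate the expected input shapes; a real run would use visit sequences derived from a mobility dataset.

        import numpy as np
        import tensorflow as tf
        from tensorflow.keras import layers

        # Assumed setup: locations are discretized into a vocabulary of
        # NUM_LOCATIONS cells/POIs; a sample is a fixed-length visit history
        # and the label is the next location (all sizes are placeholders).
        NUM_LOCATIONS = 1000  # assumed location-vocabulary size
        SEQ_LEN = 32          # assumed history length
        EMBED_DIM = 64        # embedding dimension (a parameter to vary in step (iii))
        NUM_HEADS = 4         # number of attention heads (a parameter to vary)
        FF_DIM = 128          # width of the feed-forward sublayer

        class TokenAndPositionEmbedding(layers.Layer):
            """Sum of a location embedding and a learned position embedding,
            as in the Keras Transformer tutorial [13]."""
            def __init__(self, maxlen, vocab_size, embed_dim):
                super().__init__()
                self.tok_emb = layers.Embedding(vocab_size, embed_dim)
                self.pos_emb = layers.Embedding(maxlen, embed_dim)

            def call(self, x):
                positions = tf.range(start=0, limit=tf.shape(x)[-1], delta=1)
                return self.tok_emb(x) + self.pos_emb(positions)

        inputs = layers.Input(shape=(SEQ_LEN,), dtype="int32")
        x = TokenAndPositionEmbedding(SEQ_LEN, NUM_LOCATIONS, EMBED_DIM)(inputs)

        # One Transformer encoder block (self-attention + feed-forward with
        # residual connections and layer normalization); stack more for depth.
        attn = layers.MultiHeadAttention(num_heads=NUM_HEADS, key_dim=EMBED_DIM)(x, x)
        x = layers.LayerNormalization(epsilon=1e-6)(x + attn)
        ff = layers.Dense(FF_DIM, activation="relu")(x)
        ff = layers.Dense(EMBED_DIM)(ff)
        x = layers.LayerNormalization(epsilon=1e-6)(x + ff)

        # Pool over the sequence and classify the next location.
        x = layers.GlobalAveragePooling1D()(x)
        outputs = layers.Dense(NUM_LOCATIONS, activation="softmax")(x)

        model = tf.keras.Model(inputs, outputs)
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["sparse_categorical_accuracy"])

        # Random arrays only to illustrate the expected shapes.
        X = np.random.randint(0, NUM_LOCATIONS, size=(256, SEQ_LEN))
        y = np.random.randint(0, NUM_LOCATIONS, size=(256,))
        model.fit(X, y, batch_size=32, epochs=1)

    Varying EMBED_DIM, NUM_HEADS, and the amount of training data in this sketch corresponds directly to the parameter analysis asked for in step (iii).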

    In the context of an MSc thesis or UROP project, you will develop state-of-the-art models for human mobility modeling. The goal is to use Transformer networks (e.g., GPT-3, BERT), which are deep learning language/sequence modeling methods that have the potential to improve upon the previously used RNN- and LSTM-based methods. More specifically, in this project you will: (i) review methods based on Transformer networks [8][9][10] with the aim of using them for human mobility modeling; (ii) preprocess two mobility datasets (e.g., [1][2]); (iii) build a novel Transformer network for human mobility modeling (e.g., next-location prediction) [12][13]; (iv) compare the Transformer network to an existing baseline method for human mobility modeling [5][6]; (v) summarize the results and propose future work.
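
    For step (ii), one plausible preprocessing pipeline is sketched below, assuming the raw data has already been exported to a CSV with user_id, timestamp, lat, and lon columns; the file name and column names are illustrative assumptions, not the actual schema of [1] or [2]. The scikit-mobility library [7] provides ready-made tessellation utilities for the same purpose.

        import numpy as np
        import pandas as pd

        # Assumed input: one GPS fix per row with columns user_id, timestamp,
        # lat, lon (the file name and schema are placeholders).
        df = pd.read_csv("mobility.csv", parse_dates=["timestamp"])

        CELL_SIZE = 0.01  # grid resolution in degrees (roughly 1 km); a design choice

        # Discretize coordinates into grid cells, then map each (row, col)
        # pair to an integer token so the sequences can feed an embedding layer.
        grid_row = np.floor(df["lat"] / CELL_SIZE).astype(int)
        grid_col = np.floor(df["lon"] / CELL_SIZE).astype(int)
        df["cell"] = pd.factorize(list(zip(grid_row, grid_col)))[0]

        # Build per-user, time-ordered visit sequences and sliding windows:
        # the model sees SEQ_LEN past cells and must predict the next one.
        SEQ_LEN = 32
        samples, labels = [], []
        for _, visits in df.sort_values("timestamp").groupby("user_id"):
            seq = visits["cell"].to_numpy()
            for i in range(len(seq) - SEQ_LEN):
                samples.append(seq[i:i + SEQ_LEN])
                labels.append(seq[i + SEQ_LEN])
        X, y = np.array(samples), np.array(labels)

    Applying the same pipeline to both datasets yields comparable inputs for the Transformer and for the baselines [5][6], so the comparison in step (iv) measures models rather than preprocessing choices.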

    References:

    1. Mokhtar, Sonia Ben, Antoine Boutet, Louafi Bouzouina, Patrick Bonnel, Olivier Brette, Lionel Brunie, Mathieu Cunche et al. “PRIVA’MOV: Analysing Human Mobility Through Multi-Sensor Datasets.” 2017 [https://hal.inria.fr/hal-01578557/document]
    2. Moro, Arielle, Vaibhav Kulkarni, Pierre-Adrien Ghiringhelli, Bertil Chapuis, and Benoit Garbinato. “Breadcrumbs: A Feature Rich Mobility Dataset with Point of Interest Annotation.” arXiv preprint arXiv:1906.12322 (2019) [https://arxiv.org/pdf/1906.12322.pdf]
    3. Luca, Massimiliano, Gianni Barlacchi, Bruno Lepri, and Luca Pappalardo. “Deep Learning for Human Mobility: a Survey on Data and Models.” arXiv preprint arXiv:2012.02825 (2020). [https://arxiv.org/pdf/2012.02825.pdf]
    4. Feng, Jie, Zeyu Yang, Fengli Xu, Haisu Yu, Mudan Wang, and Yong Li. “Learning to Simulate Human Mobility.” In Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 3426-3433. 2020. [https://www.youtube.com/watch?v=sj4UCW0P6Ks&ab_channel=AssociationforComputingMachinery%28ACM%29]
    5. Feng, Jie, Yong Li, Chao Zhang, Funing Sun, Fanchao Meng, Ang Guo, and Depeng Jin. “DeepMove: Predicting human mobility with attentional recurrent networks.” In Proceedings of the 2018 world wide web conference, pp. 1459-1468. 2018. [https://github.com/vonfeng/DeepMove]
    6. Yu, Lantao, Weinan Zhang, Jun Wang, and Yong Yu. “SeqGAN: Sequence generative adversarial nets with policy gradient.” In Thirty-first AAAI conference on artificial intelligence. 2017. [https://github.com/LantaoYu/SeqGAN]
    7. scikit-mobility: mobility analysis in Python [https://github.com/scikit-mobility/scikit-mobility]
    8. Brown, Tom B., Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared Kaplan, Prafulla Dhariwal, Arvind Neelakantan et al. “Language models are few-shot learners.” arXiv preprint arXiv:2005.14165 (2020). [https://arxiv.org/abs/2005.14165]
    9. Vaswani, Ashish, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, and Illia Polosukhin. “Attention is all you need.” In Advances in neural information processing systems, pp. 5998-6008. 2017. [https://arxiv.org/abs/1706.03762]
    10. Devlin, Jacob, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. “BERT: Pre-training of deep bidirectional transformers for language understanding.” arXiv preprint arXiv:1810.04805 (2018). [https://arxiv.org/pdf/1810.04805.pdf]
    11. “Open Sourcing BERT: State-of-the-Art Pre-training for Natural Language Processing”. Google AI Blog. Retrieved 2019-11-27.
    12. TensorFlow tutorial: “Transformer model for language understanding.” [https://www.tensorflow.org/tutorials/text/transformer]
    13. Keras code example: “Text classification with Transformer.” [https://keras.io/examples/nlp/text_classification_with_transformer/]
    14. “Federated Learning: Collaborative Machine Learning without Centralized Training Data.” Google AI Blog. [https://ai.googleblog.com/2017/04/federated-learning-collaborative.html]
    15. “StyleGANs: Use machine learning to generate and customize realistic images.” Heartbeat (Fritz AI). [https://heartbeat.fritz.ai/stylegans-use-machine-learning-to-generate-and-customize-realistic-images-c943388dc672]


    For more information contact: