I am a PhD student in the School of Computing Science at Simon Fraser University, supervised by Prof. Yasutaka Furukawa. My research interests lie in Computer Vision, Machine Learning, and Robotics, with a particular focus on indoor localization and tracking via multi-modal data fusion. During my PhD, I have completed research internships at Facebook Reality Labs, Apple (TDG), and Samsung Research (SAIC-Montreal).
I obtained my MSc in Computing Science from Simon Fraser University, supervised by Prof. Furukawa, and my BSc in Computer Science and Engineering from the University of Moratuwa.
General chair of Women in Computer Vision (WiCV) workshop at CVPR 2024
June 2024 Seattle, USA
Co-organizing Women in Computer Vision (WiCV) workshop at CVPR 2023
June 2023 Vancouver, Canada
Poster presentation at 2023 CRA-W Grad Cohort for Women
April 2023 San Francisco, CA
Jan. 2023 - April 2023
Sunnyvale, CA
Presented my research at ACM CAN-CWiC 2022
October 2022 Toronto, ON
Presented NILoc at CVPR 2022
June 2022 New Orleans, LA
Talent Bursary Recipient for Amii AI Week 2022
May 2022 Edmonton, Alberta
Jan. 2020 - Present
Jan. 2018 - Nov. 2019
May 2018 - Dec. 2019
Attended ICCV 2019
Oct. 2019 Seoul, South Korea
Project Mentor: Invent the Future (AI4ALL - SFU) - Robotics and Sensing team.
July 2019 website
Workshop Leader at Try/Catch 2019, Data Science Workshop.
May 2019 website
Attended 2019 CRA-W Grad Cohort for Women
April 2019 Chicago, USA
Program Mentor: Invent the Future (AI4ALL - SFU)
July 2018
Attended CVPR 2018
June 2018 Salt Lake City, USA
Oct. 2016 - Dec. 2017
Apr. 2016 - Sept. 2016
Apr. 2016 - Dec. 2016
2011-2016
First Class Honours
Dean's list in all semesters
2014 project: Linked Data Tool
Mentor: Stéphane Corlosquet
Attended the 5th South-Asia Workshop on Research Frontiers, organized by the School of Computing, NUS.
May 2015 Singapore website
2014-2015
Oct. 2014 - Apr. 2015 website
2014 project: RDF UI
Mentor: Stéphane Corlosquet
1997-2010
National Rank: 8
Musaeus College: Best University Entrant 2010
Musaeus College: Marie Musaeus Higgins Award for Outstanding Performance in the Physical Science Stream, 2010
This paper proposes the inertial localization problem, the task of estimating the absolute location from a sequence of inertial sensor measurements. We developed a solution, dubbed neural inertial localization (NILoc), which 1) uses a neural inertial navigation technique to turn the inertial sensor history into a sequence of velocity vectors; and then 2) employs a transformer-based neural architecture to infer the device location from the sequence of velocities.
Publications:
Authors: Sachini Herath, David Caruso, Chen Liu, Yufan Chen, Yasutaka Furukawa
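A minimal sketch, in PyTorch, of the two-stage NILoc idea described above. The module and parameter names (window length, floorplan grid size, layer widths) are illustrative assumptions, not the paper's actual architecture.

import torch
import torch.nn as nn

class VelocityEstimator(nn.Module):
    """Stage 1 (sketch): map a window of IMU readings (accel + gyro) to a 2D velocity."""
    def __init__(self, imu_dim=6, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(imu_dim, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(hidden, 2),          # (vx, vy)
        )

    def forward(self, imu_windows):        # (N, imu_dim, T)
        return self.net(imu_windows)       # (N, 2)

class LocationTransformer(nn.Module):
    """Stage 2 (sketch): attend over the velocity sequence and predict a
    likelihood over a discretized floorplan grid."""
    def __init__(self, num_cells, d_model=64, nhead=4, num_layers=4):
        super().__init__()
        self.embed = nn.Linear(2, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, num_cells)

    def forward(self, velocities):         # (B, L, 2)
        h = self.encoder(self.embed(velocities))
        return self.head(h[:, -1])         # logits over floorplan cells

# Toy usage: 100 IMU windows -> 100 velocities -> location over a 64x64 grid.
imu_windows = torch.randn(100, 6, 200)
velocities = VelocityEstimator()(imu_windows).unsqueeze(0)       # (1, 100, 2)
print(LocationTransformer(num_cells=64 * 64)(velocities).shape)  # (1, 4096)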
The paper proposes a multi-modal sensor fusion algorithm that fuses WiFi, IMU, and floorplan information to infer an accurate and dense location history in indoor environments. The algorithm uses 1) an inertial navigation algorithm to estimate a relative motion trajectory from IMU sensor data; 2) an industry WiFi-based localization API to obtain positional constraints and geo-localize the trajectory; and 3) a convolutional neural network to refine the location history so that it is consistent with the floorplan.
Publications:
Authors: Sachini Herath, Saghar Irandoust, Bowen Chen, Yiming Qian, Pyojin Kim, Yasutaka Furukawa
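One piece of the pipeline above that is easy to isolate is geo-localizing the IMU-derived relative trajectory against sparse WiFi position fixes. Below is a minimal sketch assuming 2D positions and a plain least-squares rigid alignment (Kabsch); the function name and toy data are hypothetical, and the paper's actual optimization and CNN refinement are more involved.

import numpy as np

def align_trajectory(rel_traj, wifi_fixes, fix_indices):
    """Find rotation R and translation t minimizing ||R p_i + t - w_i||^2
    over the trajectory samples that have a WiFi fix, then apply them."""
    P = rel_traj[fix_indices]              # (K, 2) relative positions at fix times
    W = wifi_fixes                         # (K, 2) WiFi-reported positions
    mu_p, mu_w = P.mean(0), W.mean(0)
    H = (P - mu_p).T @ (W - mu_w)          # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # keep a proper rotation (no reflection)
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_w - R @ mu_p
    return (R @ rel_traj.T).T + t          # geo-localized trajectory

# Toy usage: a straight relative path, anchored by two WiFi fixes.
rel = np.stack([np.linspace(0.0, 10.0, 50), np.zeros(50)], axis=1)
fixes = np.array([[3.0, 4.0], [10.0, 11.0]])
print(align_trajectory(rel, fixes, fix_indices=[0, 49])[:3])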
MSc Thesis Research Project. [Website] [Code]
This research focuses on data-driven inertial navigation, where the task is to estimate the positions and orientations of a moving subject from a sequence of IMU sensor measurements. We present 1) a new benchmark of IMU sensor data with ground-truth 3D trajectories under natural human motions; 2) neural inertial navigation architectures that yield significant improvements on challenging motion cases; and 3) comprehensive evaluations of competing methods and datasets.
Publications:
Authors: Hang Yan*, Sachini Herath*, Yasutaka Furukawa (* indicates equal contribution)
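To make the task concrete, here is a minimal sketch of how per-window velocity predictions are typically turned into a trajectory and scored; the helper names, sampling rate, and toy data are illustrative assumptions rather than the released benchmark code.

import numpy as np

def integrate_velocities(velocities, dt, origin=(0.0, 0.0)):
    """Cumulative sum of velocity * dt, starting from a known origin."""
    return np.asarray(origin) + np.cumsum(velocities * dt, axis=0)

def absolute_trajectory_error(pred, gt):
    """RMSE of positional error over the whole trajectory (ATE)."""
    return float(np.sqrt(np.mean(np.sum((pred - gt) ** 2, axis=1))))

# Toy usage: constant-velocity ground truth vs. noisy velocity estimates.
dt, n = 0.05, 400                          # 20 Hz predictions for 20 seconds
gt_vel = np.tile([1.0, 0.5], (n, 1))
pred_vel = gt_vel + 0.05 * np.random.randn(n, 2)
gt = integrate_velocities(gt_vel, dt)
pred = integrate_velocities(pred_vel, dt)
print(f"ATE: {absolute_trajectory_error(pred, gt):.3f} m")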
Final-year capstone project for my BSc Engineering degree. [Blog]
A content generation and rendering framework for freely navigable virtual reality environments. The environment was created from photospheres, and intermediate views were generated using features from neighbouring images. A Segway-inspired, gesture-based input system was introduced for navigation.
Publications:
Authors: Sachini Herath, Vipula Dissanayake, Sanka Rasnayaka, Sachith Seneviratne, Rajith Vidanaarachchi, Dr. Chandana Gamage
Template credit: Colorlib