Commit e677c80 (1 parent: 6af553c)

Delete all other content in home

Signed-off-by: Krishna Murthy <krrish94@gmail.com>

File tree: 1 file changed (+1, −125 lines)


docs/_pages/home.md (+1, −125)
@@ -8,128 +8,4 @@ permalink: /

# Page deprecated

> <span style="font-size:0.7em;"> You are being redirected to our new homepage: [https://gradslam.github.io](https://gradslam.github.io) for more details. </span>

### Abstract

> <span style="font-size:0.7em;"> The question of "representation" is central in the context of dense simultaneous localization and mapping (SLAM). Newer learning-based approaches have the potential to leverage data or task performance to directly inform the choice of representation. However, learning representations for SLAM has remained an open question, because traditional SLAM systems are not end-to-end differentiable. </span>

> <span style="font-size:0.7em;"> In this work, we present gradSLAM, a differentiable computational graph take on SLAM. Leveraging the automatic differentiation capabilities of computational graphs, gradSLAM enables the design of SLAM systems that allow for gradient-based learning across each of their components, or the system as a whole. This is achieved by creating differentiable alternatives for each non-differentiable component in a typical dense SLAM system. Specifically, we demonstrate how to design differentiable trust-region optimizers, surface measurement and fusion schemes, and how to differentiate over rays, without sacrificing performance. This amalgamation of dense SLAM with computational graphs enables us to backpropagate all the way from 3D maps to 2D pixels, opening up new possibilities in gradient-based learning for SLAM. </span>
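The idea of backpropagating through an entire SLAM pipeline can be illustrated with a tiny, self-contained sketch. This is NumPy-only toy code, not the gradSLAM implementation, and all names in it are illustrative: we unroll a gradient-descent solver for a one-dimensional alignment problem, so the estimate it returns is a smooth function of the inputs, and we can differentiate the *whole solver* with respect to a raw observation — the scalar analogue of backpropagating from a 3D map to 2D pixels.

```python
import numpy as np

def align(x, y, steps=50, lr=0.1, t0=0.0):
    """Unrolled gradient-descent alignment: estimate a translation t
    minimizing mean((x + t - y)**2). Every update is smooth, so the
    returned estimate is differentiable in x and y."""
    t = t0
    for _ in range(steps):
        grad = 2.0 * np.mean(x + t - y)  # d/dt of the mean squared residual
        t -= lr * grad
    return t

# Toy "map" points and a translated "observation" of them.
x = np.array([0.0, 1.0, 2.0])
y = x + 0.5                      # ground-truth translation is 0.5

t_hat = align(x, y)              # converges close to 0.5

# Differentiate the whole solver w.r.t. one input point (finite differences).
eps = 1e-6
d = (align(x, y + np.array([eps, 0.0, 0.0])) - t_hat) / eps
```

Here `d` is well defined and close to 1/3 (each of the three observations contributes equally to the estimate), which is exactly the property a non-differentiable SLAM system lacks.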

### Explanatory video

<div style="text-align:center">

Click on the image below to watch a video that explains what gradSLAM is and showcases our primary results.

<br/> <br/>

<a href="http://www.youtube.com/watch?feature=player_embedded&v=2ygtSJTmo08" target="_blank"><img src="http://img.youtube.com/vi/2ygtSJTmo08/0.jpg" alt="Click to watch the explanatory video."/></a>

</div>

### About gradSLAM

gradSLAM is a fully differentiable dense SLAM framework. It provides a repository of differentiable building blocks for a dense SLAM system, such as differentiable nonlinear least squares solvers, differentiable ICP (iterative closest point) techniques, differentiable raycasting modules, and differentiable mapping/fusion blocks. These blocks can be used to construct SLAM systems that allow gradients to flow all the way from the outputs of the system (map, trajectory) to its inputs (raw color/depth images, parameters, calibration, etc.).
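One of the building blocks above, the differentiable nonlinear least squares solver, hinges on removing the discrete accept/reject branch from trust-region-style optimizers. The sketch below is in the spirit of that idea but is not gradSLAM's code: a hypothetical, scalar Levenberg-Marquardt-like loop in NumPy where a sigmoid of the error decrease smoothly gates both the step and the damping update, so every iterate stays differentiable in the inputs.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def smooth_lm(x, y, steps=30, lam=1.0, scale=1e-2):
    """Damped Gauss-Newton on mean((x + t - y)**2) with the usual
    discrete accept/reject of a step replaced by a sigmoid gate.
    Illustrative sketch only (scalar translation problem)."""
    t = 0.0
    for _ in range(steps):
        r = x + t - y                          # residuals; Jacobian w.r.t. t is 1
        err = (r ** 2).sum()
        dt = -r.sum() / (len(x) + lam)         # damped Gauss-Newton step
        err_new = ((x + (t + dt) - y) ** 2).sum()
        g = sigmoid((err - err_new) / scale)   # ~1 if the step helps, ~0 otherwise
        t = t + g * dt                         # smooth "accept/reject" of the step
        lam = lam * (0.5 * g + 2.0 * (1 - g))  # smooth damping update
    return t

x = np.array([0.0, 1.0, 2.0])
y = x + 0.5
t_hat = smooth_lm(x, y)   # converges near the true translation 0.5
```

A classical LM loop would branch on `err_new < err`, which makes the output piecewise and blocks gradients at the branch; the sigmoid gate trades a little optimizer crispness for differentiability.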

Specifically, we implement differentiable versions of three classical dense SLAM systems using the gradSLAM framework: KinectFusion, PointFusion, and ICP-SLAM. We chose these because:

* each of these approaches sparked a new line of research in dense SLAM;
* the approaches themselves are fairly simple from an algorithmic standpoint;
* we aim to provide differentiable SLAM solutions for a wide variety of 3D representations (voxels, surfels, points).
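To make concrete what "differentiable ICP" can mean, here is a minimal NumPy sketch — an illustrative stand-in, not the gradSLAM implementation: the hard nearest-neighbour association step of ICP is replaced by a softmin over pairwise distances, so the resulting translation estimate is a smooth function of both point clouds.

```python
import numpy as np

def soft_icp_translation(src, tgt, temp=0.01):
    """One soft-association ICP step: instead of a hard (non-differentiable)
    nearest-neighbour lookup, each source point is matched to a softmin-
    weighted combination of target points. Illustrative sketch only."""
    # (N, M) squared distances between all source/target pairs
    d2 = ((src[:, None, :] - tgt[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / temp)
    w /= w.sum(axis=1, keepdims=True)   # soft correspondence weights
    matched = w @ tgt                   # expected match for each source point
    return (matched - src).mean(axis=0) # least-squares translation estimate

src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
tgt = src + np.array([0.3, -0.2])      # target cloud shifted by a known offset

t = soft_icp_translation(src, tgt)     # recovers approximately (0.3, -0.2)
```

With a small temperature the soft weights concentrate on the true matches, so the estimate agrees with classical ICP while remaining differentiable; larger temperatures smooth the association at the cost of bias.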

<div style="text-align:center">

<img src="{{ site.baseurl }}/images/header.png" />

</div>

### Qualitative results

The differentiable SLAM systems perform quite similarly to their non-differentiable counterparts, while allowing gradients to flow right through. This makes gradSLAM attractively poised for use in gradient-based learning systems.

Here is a comparative analysis of the three differentiable SLAM systems we provide.

<div style="text-align:center">

<img src="{{ site.baseurl }}/images/icl.png" />

</div>

Below are more qualitative results, this time on the ScanNet dataset.

<div style="text-align:center">

<img src="{{ site.baseurl }}/images/scannet.png" />

</div>

Further, we have evaluated gradSLAM on the TUM RGB-D benchmark, as well as on an in-house sequence captured with an Intel RealSense D435 camera.

<div style="text-align:center">
<img src="{{ site.baseurl }}/images/tum.gif" />
</div>

<div style="text-align:center">
<img src="{{ site.baseurl }}/images/mrsd.gif" />
</div>

### Paper

<div style="text-align:center">

A preprint is available on arXiv. Click on the thumbnail below to access it.

<br/> <br/>

<a href="{{ site.baseurl }}/paper.pdf" target="_blank"><img src="{{ site.baseurl }}/images/paper.png" alt="Click to access the preprint."/></a>

</div>

<b>The paper has been accepted for publication at the IEEE International Conference on Robotics and Automation (ICRA), 2020.</b>

### Citing us

If you would like to cite us, you can use the following BibTeX entry.

```
@inproceedings{gradslam,
  author    = {Krishna Murthy, J. and Iyer, Ganesh and Paull, Liam},
  title     = {gradSLAM: Dense SLAM meets Automatic Differentiation},
  booktitle = {IEEE International Conference on Robotics and Automation (ICRA)},
  year      = {2020},
}
```

### Code

gradSLAM will be released as an open-source SLAM framework for PyTorch.
<s> As the manuscript is currently under review, we are eyeing a release date in the latter half of January 2020. </s>
We are working on a full code release, and it is taking longer than expected. Estimated release month: April.

<span style="color:blue">To be notified when the code is released, watch <a href="https://github.com/montrealrobotics/gradSLAM">this GitHub repository</a>.</span>

### Correspondence

For further discussion, or for clarifications on implementation details, please reach out to [Krishna Murthy](https://krrish94.github.io), [Ganesh Iyer](https://epiception.github.io/), or [Liam Paull](http://liampaull.ca). We would also love to hear from anyone working on similar ideas.

### More qualitative results

Stay tuned for more GIFs!

### Acknowledgements

The authors are grateful for the wonderful help from several people, including, but not limited to, [Gunshi Gupta](https://gunshi.github.io), [Ankur Handa](https://ankurhanda.github.io/), [Bhairav Mehta](https://bhairavmehta95.github.io), [Mark Van der Merwe](https://mvandermerwe.github.io/), [Aaditya Saraiya](https://www.ri.cmu.edu/ri-people/aaditya-saraiya/), [Parv Parkhiya](https://www.ri.cmu.edu/ri-people/parv-parkhiya/), [Akshit Gandhi](https://www.ri.cmu.edu/ri-people/akshit-kishor-gandhi/), [Shubham Garg](https://www.ri.cmu.edu/ri-people/shubham-garg/), [Tejas Khot](https://tejaskhot.github.io), [Gautham Swaminathan](https://www.ri.cmu.edu/ri-people/swaminathan-gurumurthy/), [Zeeshan Zia](http://zeeshanzia.com), [Ronald Clark](http://ronnieclark.co.uk), and [Sajad Saeedi](https://www.sajad-saeedi.ca/).

+> <span style="font-size:0.7em;"> You are being redirected to our new homepage: [https://gradslam.github.io](https://gradslam.github.io)</span>
