# Course talk:CPSC522/FastSLAM


## Contents

| Thread title | Replies | Last modified |
|---|---|---|
| Feedback (F3) | 0 | 18:49, 19 March 2019 |
| Feedback from F5 | 1 | 00:33, 19 March 2019 |

I thought the discussion of FastSLAM was good. My only comment is that it would be helpful to specify whether we know which landmark is observed at each step. From the paper, I think we do, but I couldn't tell from the article.

To better show the contribution of paper 2, it could help to emphasise where paper 1 fits into it: when is FastSLAM the best method, and when is it better to use another method from the survey paper? For example, if the Gaussian assumptions are unreasonable, presumably there is another approach we should use.

Overall:

- Just a few more sentences about the implementation, e.g., the probability representations, data structures and computational complexity, would make the review significantly more insightful.
- You are hinting at, but not explicitly mentioning, the use of EKF in FastSLAM (until much later in the review section).
- The second paper is a review, which makes it less well suited to the assignment than a paper which, say, proposes a new technique which can be directly compared to FastSLAM.
- Naturally, the second part of your article is therefore much less technical than the first, and reads more like an overview of the field that would suit the beginning of your article. You might consider either going into slightly more detail on one of the techniques, or changing your second paper.
- A few sentences about experiments would be great, e.g., computational cost (number of parameters, particles, observations), comparison to other work.
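To make the implementation request above concrete, here is the kind of detail I mean. This is a minimal, hypothetical sketch (the names and the measurement model are my own, not taken from the article or the paper) of FastSLAM's per-particle representation: each particle carries a robot pose sample plus one small EKF (mean and covariance) per landmark, assuming known data association.

```python
import numpy as np

class Particle:
    """One FastSLAM particle: a pose sample plus per-landmark EKFs.

    Illustrative sketch only; a real implementation also needs motion
    sampling, importance weighting, and resampling.
    """

    def __init__(self, pose):
        self.pose = np.asarray(pose, dtype=float)  # (x, y, heading) sample
        self.landmarks = {}  # landmark id -> (mean (2,), covariance (2, 2))

    def update_landmark(self, lid, z, h, H, R):
        """EKF measurement update for a single landmark.

        z: observed measurement, h: predicted measurement at the current
        estimate, H: Jacobian of the measurement model w.r.t. the landmark
        position, R: measurement noise covariance.
        """
        mu, sigma = self.landmarks[lid]
        S = H @ sigma @ H.T + R             # innovation covariance
        K = sigma @ H.T @ np.linalg.inv(S)  # Kalman gain
        mu = mu + K @ (z - h)               # corrected landmark mean
        sigma = (np.eye(len(mu)) - K @ H) @ sigma
        self.landmarks[lid] = (mu, sigma)
```

The point a few sentences in the review could make: each measurement touches only one landmark's small Gaussian rather than a joint covariance over the whole map, which is where FastSLAM's efficiency over the full EKF comes from.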

Notation:

- Equation blocks are hard to read. Better alignment would probably help.
- Your equation comments have removed important information from the formulation in the paper, without adding simpler explanations. I found the original comments like "Markov property" and various particle filter approximations more informative than the "independence" comment you provide.
- You are reproducing the paper's notation, but you might consider cleaning it up a bit. Their use of superscript and subscript is idiosyncratic: in mathematics, usually [m] = {1, ..., m}, not [m] = m. Also, the current common notation for <x_{1}, ..., x_{t}> is x_{1:t}, not x^{t}.
- Maybe you could summarize the description of symbols (which you already have in text form) into a table, and also include descriptions for all important probabilities. For example, I had to look into the paper to understand that p(\theta_{n_t}^{[m]}) is a Gaussian posterior.
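As one concrete example of the suggested cleanup, FastSLAM's Rao-Blackwellized factorization could be written in x_{1:t} notation (this is my own rendering of the factorization, not a quote from the article; x is the robot pose, \theta_n landmark n, z the measurements, u the controls, and n_{1:t} the data associations):

```latex
p(x_{1:t}, \theta_{1:N} \mid z_{1:t}, u_{1:t}, n_{1:t})
  = p(x_{1:t} \mid z_{1:t}, u_{1:t}, n_{1:t})
    \prod_{n=1}^{N} p(\theta_n \mid x_{1:t}, z_{1:t}, n_{1:t})
```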

Thank you for the feedback,

About the EKF (and the other technical details): I am not familiar with them, so I avoided discussing them.

I couldn't find a better paper on the topic, and the review was not satisfying either.

About notation,

The suggestion about cleaning up the notation is a good one; if I have time, I will look into it.