VRAR_Readme.txt

This document tells you what is contained in this repository and how to use it to
 - create your own videos (Section B)
 - render a video (Section A)
 - create ground-truth data (Section B)
 - test a library (Section C)
 - evaluate the results (Section D)

## A Render the Videos ##

1. Install and open Blender (Version 2.72a)
2. Download and open one of the .blend-Files
3. Browse to the "Render"-Tab and set the resolution you want to render
4. Set the output path ("Output")
5. Click on "Render" and wait for the rendering to complete (steps 3-5 can also be scripted, see the sketch below)
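
If you prefer, the same render settings can also be driven from Blender's Python console instead of the GUI. A minimal sketch, assuming example values for the resolution and output path (neither is prescribed by our scenes):

```python
import bpy

scene = bpy.context.scene

# Example resolution; set whatever you want to render.
scene.render.resolution_x = 1280
scene.render.resolution_y = 720
scene.render.resolution_percentage = 100

# "//" makes the path relative to the opened .blend file.
scene.render.filepath = "//render/"

# Render the whole animation and write the frames to the output path.
bpy.ops.render.render(animation=True)
```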

## B Make changes to our videos or create your own videos and create ground-truth data ##

1.  Install and open Blender (Version 2.72a)
2.  If you want to use your own markers, put them in one of the folders "BART" or "ALVAR"
3.  Download our plugin for Blender ("Testing" state, not supported by the Blender community)
4.  Install the plugin by copying it to a subdirectory of your plugin folder.
5.  Download and open any of our .blend-Files you want to change, or start a new one.
6.  Make your modifications and click one of the buttons "BART" or "ALVAR" to reload the markers.
  - If you want to create new markers, use the button "Marker erstellen" ("create marker").
  - You can use the operator "Marker randomisieren" ("randomize markers"), which switches the texture of the markers 12 times per second.
7.  Browse to the "Render"-Tab and set the resolution you want to render
8.  Set the output path ("Output")
9.  Click on "Render" and wait for the rendering to complete
10. Insert a path for the ground-truth data in the text field below "Positionsdaten ausgeben" ("output position data").
    NOTE: .txt- and .csv-files with the same name as given in this step may be overwritten.
11. Create the ground-truth data by clicking on "Ausgeben" ("output").
    You always get a .csv file and a .txt file; the .txt file is more descriptive. (A rough Python sketch of such an export is shown below.)
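
The ground-truth data itself is written by our plugin in step 11. If you only want to see roughly what such per-frame pose data looks like, or dump something comparable without the plugin, a sketch from Blender's Python console could look like this; the object name "Marker", the output path and the column order are assumptions for illustration, not the plugin's actual format (check the descriptive .txt from step 11 for that):

```python
import bpy

scene = bpy.context.scene
obj = bpy.data.objects["Marker"]  # placeholder: use the name of your marker object

with open("/tmp/groundtruth_sketch.csv", "w") as f:  # placeholder output path
    for frame in range(scene.frame_start, scene.frame_end + 1):
        scene.frame_set(frame)
        rot = obj.matrix_world.to_3x3()          # rotation part of the world matrix
        loc = obj.matrix_world.to_translation()  # translation part
        # frame number, rotation matrix written row by row, translation vector
        values = [frame] + [rot[i][j] for i in range(3) for j in range(3)] + list(loc)
        f.write(",".join(str(v) for v in values) + "\n")
```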

## C Test a library ##

Find out how you can access the library you want to test and write a connection to it. If you want to use our scripts for evaluation, your connection has to write a csv-file with the following format:

frame-number,r1,r2,r3,r4,r5,r6,r7,r8,r9,t1,t2,t3,scale,distance

r1 to r9 should give a rotation matrix and t1 to t3 should give a translation vector.
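
Each detected pose then becomes one line of that file, whatever language your connection is written in. As a Python sketch (frame, rot, trans, scale and distance stand for whatever your library returns; writing r1 to r9 row by row is an assumption, match it to your own evaluation):

```python
import csv

def append_pose(writer, frame, rot, trans, scale, distance):
    """Write one detection as frame,r1..r9,t1..t3,scale,distance."""
    row = [frame]
    row += [rot[i][j] for i in range(3) for j in range(3)]  # assumed row-by-row order
    row += list(trans) + [scale, distance]
    writer.writerow(row)

with open("results.csv", "w", newline="") as f:  # placeholder file name
    writer = csv.writer(f)
    # Example row: identity rotation and example translation, scale and distance values
    append_pose(writer, 1, [[1, 0, 0], [0, 1, 0], [0, 0, 1]], [0.0, 0.0, 0.5], 1.0, 0.5)
```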

## D Evaluate the Results ##

Depending on your goal, you need to do your own evaluation of the data produced by your library (or libraries) together with the ground-truth data.
If you want to use some of the scripts we created for our work, have a look at the following list (a small Python sketch of the basic comparison follows the list):

- evaluation/genauigkeit_berechnen.c: Calculates the detection rate and the accuracy of position detection.
- R/rotm.r: Program to calculate the accuracy of rotation detection and to generate plots from this data.
- R/transl.r: Creates plots from the accuracy of position detection calculated in evaluation/genauigkeit_berechnen.c.
- R/histogramm_markernummern.r: Creates a histogram of the detections over the associated marker IDs.
- R/main2.r: Converts rotation accuracy based on rotation matrices to rotation accuracy based on Euler angles.
  - R/fit.r: Helper that fits the conversion for one matrix.
- R/share/hauptdatensaetze_laden.r: A shared script that loads the necessary files.
- R/Bestimmung_Umformungen_Rotm: This folder contains matrices used to adjust the rotation matrices from BART, ALVAR and ARUCO. You may need to do the same for your library if the format of the rotation matrices in r1 to r9 (see Section C) differs from those of these libraries.
- R/benchmark.r: Contains a primitive benchmark that compares detection rates.
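
If you just want a quick plausibility check before running the C and R scripts above, the basic comparison can be sketched in Python. The sketch assumes that both files use the column layout from Section C (check the descriptive .txt from Section B for the actual ground-truth layout); the file names are placeholders, and only a simple detection rate and mean translation error are computed:

```python
import csv
import math

def load_poses(path):
    """Map frame number -> [r1..r9, t1..t3, scale, distance] as floats."""
    poses = {}
    with open(path) as f:
        for row in csv.reader(f):
            if not row or not row[0].strip().isdigit():
                continue  # skip empty lines and a possible header line
            poses[int(row[0])] = [float(v) for v in row[1:]]
    return poses

truth = load_poses("groundtruth.csv")  # placeholder file names
found = load_poses("results.csv")

errors = []
for frame, t in truth.items():
    if frame in found:
        r = found[frame]
        # t1..t3 are the three values after the nine rotation entries (indices 9..11)
        errors.append(math.sqrt(sum((t[9 + i] - r[9 + i]) ** 2 for i in range(3))))

print("detection rate:", len(errors) / len(truth))
if errors:
    print("mean translation error:", sum(errors) / len(errors))
```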