Hi everyone! I am glad to be a part of the OpenFrameworks community, hopefully outside the beginner forum soon.
I am working on a multimedia project where an IR tracker's position is captured with a camera and projected back onto the same screen with a projector.
The problem I face at the moment is that the tracker position taken by the camera has to be calibrated to fit the projection's size and position. I was planning to use a simple map() call per axis (horizontal and vertical) to remap the tracked coordinates to screen coordinates, roughly as in the sketch below, but this will only give a good result when the camera is located directly in front of the screen.
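For reference, this is the naive version I had in mind (the min/max calibration values are made up for illustration):

```cpp
#include "ofMain.h"

// Naive per-axis remap: fine only when the camera looks at the screen
// head-on, because it assumes x and y can be calibrated independently.
// The camMin/camMax values are hypothetical numbers I would measure
// once during calibration.
ofVec2f cameraToScreenNaive(const ofVec2f& trackedPos) {
    const float camMinX = 90.f,  camMaxX = 560.f;
    const float camMinY = 87.f,  camMaxY = 410.f;
    return ofVec2f(
        ofMap(trackedPos.x, camMinX, camMaxX, 0, ofGetWidth()),
        ofMap(trackedPos.y, camMinY, camMaxY, 0, ofGetHeight()));
}
```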
When the camera is offset horizontally or vertically, the tracked region becomes a trapezoid (where ab != cd in the camera view I), and in the general case an arbitrary quadrilateral where none of the sides are equal.
I want the tracking and projection to be as precise as possible, so I am looking for a way to correctly map the tracker position to the projected screen position. I can take 4 positions of the tracker, which will represent points a, b, c, d in the camera view I, but I don't know how to map them to points a, b, c, d on the projected screen II.
There is warpPerspective in OpenCV, which applies a similar operation to a whole image, so I need the same thing, but just for coordinates. There is probably an easy solution for this; I just don't know the right terms to search for.
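From skimming the OpenCV docs, I have a feeling cv::getPerspectiveTransform (to build a 3x3 matrix from the four point pairs) plus cv::perspectiveTransform (to push individual points through it, instead of warping pixels) might be what I'm after. Here is a sketch of what I imagine, with made-up corner values; is this the right direction?

```cpp
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

// Map one tracked camera point into projector/screen coordinates.
// The corner values are hypothetical; in practice a, b, c, d would be
// the four calibration positions of the tracker.
cv::Point2f cameraToScreen(const cv::Point2f& trackedPos) {
    // a, b, c, d as the camera sees them (view I)
    static const std::vector<cv::Point2f> cameraCorners = {
        {102.f, 87.f}, {540.f, 95.f}, {560.f, 410.f}, {90.f, 400.f}
    };
    // the same a, b, c, d in projector coordinates (screen II, 1280x720 here)
    static const std::vector<cv::Point2f> screenCorners = {
        {0.f, 0.f}, {1280.f, 0.f}, {1280.f, 720.f}, {0.f, 720.f}
    };
    // 3x3 perspective matrix, computed once from the four point pairs
    static const cv::Mat H = cv::getPerspectiveTransform(cameraCorners, screenCorners);

    std::vector<cv::Point2f> in = { trackedPos }, out;
    cv::perspectiveTransform(in, out, H);
    return out[0];
}
```

If that is the right idea, I guess I would just run each frame's tracker position through this function. Does that sound correct, or is there a more standard way to do it inside openFrameworks?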