We have a robot that uses a servo with a Parallax IR sensor mounted on top. The servo sweeps 180 degrees and populates an array on a microcontroller with the range seen at each position. This array of ranges contains enough data to extract the left and right edges of objects in the room. I want to make a simulation using pygame that generates simulated scan data for hypobots to match against (if you don't know the term, a hypobot is just a statistical hypothetical robot position).
Python (using the pygame library)
I want to make an array, simulated on a virtual map.
The array will hold one value per degree, from 0 to 180.
Each entry is the length of the distance vector from the origin (later to be a variable robot location) to the first object the sensor sees at that angle.
For example, if there is an object of width 3 directly in front of us, we will see an array like 9,9,9,9,9,1,1,1,9,9,9,9,9,9.
The 1s are the near distance of the object; the 9s are the distant background.
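Here is a rough sketch of how I picture building that array: march along each ray in small steps until it enters an obstacle or runs out of range. All the names, the rectangle obstacle format, and the step size are just my guesses, not settled design.

```python
import math

def cast_ray(origin, angle_deg, obstacles, max_range=9.0, step=0.05):
    """March along a ray from `origin` at `angle_deg` until it enters an
    axis-aligned obstacle rect (x, y, w, h) or reaches max_range.
    Returns the distance travelled."""
    ox, oy = origin
    dx = math.cos(math.radians(angle_deg))
    dy = math.sin(math.radians(angle_deg))
    dist = 0.0
    while dist < max_range:
        px, py = ox + dx * dist, oy + dy * dist
        for (rx, ry, rw, rh) in obstacles:
            if rx <= px <= rx + rw and ry <= py <= ry + rh:
                return dist
        dist += step
    return max_range

def sweep(origin, obstacles, max_range=9.0):
    # One distance per degree, 0..180 inclusive -> 181 entries.
    return [cast_ray(origin, a, obstacles, max_range) for a in range(181)]
```

With the origin at (0, 0) and a single rectangle one unit "above" it, `sweep` returns 181 entries that are the background range (9) everywhere except the angles covering the rectangle, which read roughly 1, matching the 9,9,...,1,1,1,...,9,9 pattern above.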
The core of it:
How should I go about drawing a vector that terminates as soon as it hits an object?
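The way I'm imagining it (and please correct me if there is a better approach): step along the ray until a point falls inside an obstacle, and that point becomes the line's endpoint. The helper below is pure math so it can be tested without a display; the pygame call in the comment is how I'd expect to draw the result. Names and the pixel-step approach are hypothetical.

```python
import math

def ray_hit_point(origin, angle_deg, obstacles, max_range=300.0, step=1.0):
    """Step along the ray one unit at a time; stop at the first point
    inside any obstacle rect (x, y, w, h), or at max_range.
    Returns the (x, y) endpoint where the vector should terminate."""
    ox, oy = origin
    dx = math.cos(math.radians(angle_deg))
    dy = math.sin(math.radians(angle_deg))
    dist = 0.0
    while dist < max_range:
        px, py = ox + dx * dist, oy + dy * dist
        if any(rx <= px <= rx + rw and ry <= py <= ry + rh
               for (rx, ry, rw, rh) in obstacles):
            break
        dist += step
    return (ox + dx * dist, oy + dy * dist)

# Inside the pygame frame loop, the terminated vector would then be:
#   end = ray_hit_point(robot_pos, angle, walls)
#   pygame.draw.line(screen, (255, 0, 0), robot_pos, end, 2)
```

A finer `step` gives a more accurate termination point at the cost of more iterations per ray; since the real sensor only reports 181 samples, a 1-pixel step per ray should be cheap enough per frame.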
Communities like yours are a godsend to people like me; thank you so much for taking the time to go down this path with me.
Please ask any questions, because I know a lot gets lost in communication between electrical and software engineering.