Left to right: the steering wheel used to control the car’s rotation, the PyScript UI for the machine learning model that starts and stops the car, and the car itself.
# Assumes sensor, clock, math, and the MQTT client are set up earlier in the script
# (a sketch of that setup follows the explanation below).
while True:
    clock.tick()
    img = sensor.snapshot()
    for tag in img.find_apriltags():
        # Mark each detected tag on the frame buffer for debugging.
        img.draw_rectangle(tag.rect(), color=(255, 0, 0))
        img.draw_cross(tag.cx(), tag.cy(), color=(0, 255, 0))
        tag_id = tag.id()
        tag_name = tag.family()
        tag_rotation = (180 * tag.rotation()) / math.pi  # convert radians to degrees
        currentTag = f'{tag_id}, {tag_name}'
        print("Tag Family %s, Tag ID %d, rotation %f (degrees)" % (tag_name, tag_id, tag_rotation))
        if currentTag == '42, 16':
            # Only the steering-wheel tag (ID 42, family 16) is published to the car.
            client.publish("ME35-24/kai", str(tag_rotation))
This is the main logic for the camera that reads the AprilTag. It captures frames from the camera and searches each one for AprilTags, drawing a marker over any tag it finds. When the specific steering-wheel tag (ID 42, family 16) is detected, it publishes the tag's rotation angle, in degrees, to the MQTT topic "ME35-24/kai".
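The loop above relies on a camera sensor, clock, and MQTT client that are configured before it runs. Below is a minimal sketch of what that setup could look like on an OpenMV camera; the broker address, client ID, and frame settings are placeholder assumptions rather than the project's actual values, and connecting the camera to Wi-Fi is omitted.

import sensor, time, math
from mqtt import MQTTClient  # MQTT module bundled with the OpenMV examples

# Camera setup: small frames keep find_apriltags() fast enough for a live loop.
sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QQVGA)
sensor.skip_frames(time=2000)
clock = time.clock()

# MQTT setup (placeholder broker and client ID; Wi-Fi connection assumed already made).
client = MQTTClient("openmv_steering_wheel", "broker.hivemq.com", port=1883)
client.connect()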
try:
    # Numeric payloads are steering-wheel angles (in degrees) from the camera.
    recent_msg_float = float(recent_msg)
    if 85 < recent_msg_float < 95:
        # Wheel held near 90 degrees: return to the center position (counter = 0).
        if counter < 0:
            stepper.right(0 - counter)
        elif counter > 0:
            stepper.left(counter)
        counter = 0
    elif 75 < recent_msg_float < 85:
        # Wheel rotated below ~85 degrees: move to steering position -1.
        if counter < -1:
            stepper.right(abs(counter + 1))
        elif counter > -1:
            stepper.left(counter + 1)
        counter = -1
    elif 95 < recent_msg_float < 105:
        # Wheel rotated above ~95 degrees: move to steering position 1.
        if counter < 1:
            stepper.right(1 - counter)
        elif counter > 1:
            stepper.left(counter - 1)
        counter = 1
except ValueError:
    # Non-numeric payloads are commands sent by the machine learning model.
    if recent_msg == 'start':
        left_motor.forward()
    elif recent_msg == 'stop':
        left_motor.stop()
This code snippet is the main logic for handling the MQTT messages and the steering. Numeric messages are steering-wheel angles published by the camera: the code maps the angle to one of three steering positions (roughly 80, 90, or 100 degrees map to positions -1, 0, and 1) and steps the stepper motor left or right until the tracked position in counter matches. Messages that cannot be parsed as numbers are the "start" and "stop" commands sent by the machine learning model, which start and stop the left motor that drives the car.
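For context, here is a hedged sketch of how recent_msg could be populated on the car's side. It assumes a umqtt-style client and that both the steering angles and the "start"/"stop" commands arrive over MQTT; the broker address, client ID, and second topic name are placeholders, not the project's actual values.

from umqtt.simple import MQTTClient

recent_msg = ''
counter = 0  # tracked steering position: -1, 0, or 1

def on_message(topic, msg):
    # Store the latest payload; the steering/command logic above runs on it each loop.
    global recent_msg
    recent_msg = msg.decode()

client = MQTTClient('car_client', 'broker.hivemq.com')  # placeholder broker and client ID
client.set_callback(on_message)
client.connect()
client.subscribe('ME35-24/kai')          # steering angles published by the camera
client.subscribe('ME35-24/kai/control')  # hypothetical topic for the ML model's start/stop

while True:
    client.check_msg()  # non-blocking poll; calls on_message when a message arrives
    # ... the try/except block above would run here on the latest recent_msg ...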