Week 9

This week I finished the program. By "finished", I mean I got it to a state where it will follow the line and complete all the actions it is given. As it currently stands, there are actions added for Blue, Green and Red, the three colours featured on the track.

However, they aren't without problems. The current program makes the robot pick up an object (a crude cardboard square) upon detection of Blue. The problem with this function is that the robot's arm has a tendency to get stuck (due to the design and structure of the robot). When it gets stuck, the robot stops dead until the arm either falls off or the program is stopped and restarted. I am currently looking for a way to fix this. The function is presented below.


It is initialized via a boolean variable, which is set to true by the main function upon detection of Blue. The reason for having the function in a separate instance is so that there can be a delay which prevents the function from being initialized in the first 3 seconds of the run, or within 13 seconds of any previous initialization thereafter.
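The program itself is built from graphical blocks, so it can't be reproduced as text here, but the gating logic just described can be sketched in Python. Everything below (the `PickupGate` name, the injectable clock) is my own illustration, not something from the actual software:

```python
import time

class PickupGate:
    """Sketch of the delay logic described above: the pickup routine may
    not fire in the first 3 seconds of the run, nor within 13 seconds of
    its previous firing."""

    STARTUP_DELAY = 3.0   # seconds before the first pickup is allowed
    COOLDOWN = 13.0       # seconds between successive pickups

    def __init__(self, clock=time.monotonic):
        # clock is injectable so the logic can be tested without waiting
        self._clock = clock
        self._start = clock()
        self._last_fired = None

    def try_fire(self):
        """Return True (and record the time) if a pickup may start now."""
        now = self._clock()
        if now - self._start < self.STARTUP_DELAY:
            return False
        if self._last_fired is not None and now - self._last_fired < self.COOLDOWN:
            return False
        self._last_fired = now
        return True
```

In the block program the same effect is achieved with wait blocks in a separate instance; the class above only mirrors the timing rules.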

Upon detection of Green (which, due to the current track design, comes after Blue), the robot turns, drives off the track, drops the cardboard square it is carrying, and then reverses, turns around, and finds its way back to the line. The problem here is that because of the robot's movement (zig-zagging across the right edge of the line), it sometimes misses the green circle. This can hopefully be remedied by adding some green tape over the black area around the circle. There isn't really any need to show the program for this, as it is just a string of simple motor command blocks.

The Red circle is right at the end of the track. The robot counts how many times it detects it as the number of laps. This value is then compared with a value for the maximum number of laps, which is set at the beginning, when the user sends a value to the robot via Bluetooth. If the user chooses to skip this with a button press, a default value of 3 laps is used (this default may need to be changed at a later date, as the robot takes a fairly long time to get around the whole track). The function is presented below:
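Since the block diagram can't be shown as text, here is a rough Python sketch of the lap-counting logic described above. The names (`resolve_max_laps`, `LapCounter`) are my own, and the Bluetooth value is represented as a plain argument rather than a real message block:

```python
DEFAULT_MAX_LAPS = 3  # fallback when the user skips the Bluetooth prompt

def resolve_max_laps(bluetooth_value=None):
    """Use the lap count sent over Bluetooth, or the default of 3 laps
    if the user skipped the prompt with a button press."""
    return bluetooth_value if bluetooth_value is not None else DEFAULT_MAX_LAPS

class LapCounter:
    def __init__(self, max_laps):
        self.max_laps = max_laps
        self.laps = 0

    def on_red_detected(self):
        """Called each time the colour sensor reports Red. Returns True
        once the robot has completed its final lap and should stop."""
        self.laps += 1
        return self.laps >= self.max_laps
```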


This function is also part of a separate instance, again so that it can have a delay in place that prevents it from being triggered again within 3 seconds of a previous initialization.

In addition to the colour functions, I added a small segment of code that uses the IR sensor. This code runs in parallel with the main code and, upon detection of an object (say, the user's hand) at close range, stops both large motors. This comes in handy when the user needs to stop the program or move the robot mid-execution: they can simply hold a hand in front of it and the robot will stop. This IR function is presented below:
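The block itself is graphical, but the decision it makes is a single threshold test. A minimal sketch, assuming a proximity reading on a 0–100 scale (the threshold value here is illustrative, not the one from our program):

```python
STOP_DISTANCE = 10  # hypothetical close-range threshold on a 0-100 proximity scale

def motors_should_stop(ir_proximity):
    """Parallel watchdog logic: report that both large motors should stop
    when an object (e.g. the user's hand) is detected at close range."""
    return ir_proximity < STOP_DISTANCE
```

In the real program this check sits in a loop running alongside the main sequence, so the stop takes effect whenever the hand appears.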


Finally, here is a demonstration of the Blue and Green functions combined for the final game:

The group member who rarely turns up actually made an appearance at the Friday session this week; however, he seemed more intent on sleeping than on doing any work. Two of the other members did not show up, and gave the group no notice of their absence.


Week 8

Beginning with a review of my request to my group at the end of last week:

None of my team even looked at the program. None of them knows how the current program works, or even how the software itself works. I have come to the conclusion that I will have to complete this program alone; otherwise it will remain incomplete.

Furthermore, no progress has been made with the videos. We have no footage of the current program. Whilst I could also take the time to film the robot myself, I feel that my time on the project is better spent on the program. One group member persists in evading all contact and all group sessions. Yih-Ling has spoken to our group regarding the individual, although that does not help us (me) complete the current project with one person fewer.

In the first session this week, I had a breakthrough. I managed to create a program that made the robot follow the line using the colour measurements instead of reflected light intensity. This paved the way for the rest of the program, and I was able to quickly implement a way for the robot to detect other colours whilst continuing to follow the line smoothly.
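The structure that made this work, as described above, is a line-follower that continuously checks the colour reading and hands off to an action when one of the special colours appears. A very rough Python sketch of that dispatch idea (the `step` function and the handler dictionary are my own illustration of the structure, not the actual block program):

```python
def step(colour, actions):
    """One iteration of the main loop: if the current colour reading has a
    registered action (e.g. Blue -> pickup), run it; otherwise keep
    line-following. Returns which branch was taken, for clarity."""
    if colour in actions:
        actions[colour]()   # run the colour-specific routine
        return colour
    return "follow"         # no special colour: continue following the line
```

Because the colour check happens every iteration, the robot keeps following the line smoothly between detections.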

I created a flowchart as a plan for the game, detailing roughly what each colour function would do, and roughly how the program would execute each function.


One of the group members created a track over the week, which we were to use in the presentation of the program (as detailed in the brief for this week). Whilst this was good news, the bad news was that the robot failed to execute the program properly on the day of the presentation. I had to explain to the guy running the session that the robot didn't seem to be working, and requested that they skip us. Whilst the other groups went ahead, demonstrated their robots and gained valuable feedback on their games, I sat and attempted to fix the program, and my group, the wonderful help that they are, sat and watched me for an hour. Two of them even went to the other side of the desk and talked about Match of the Day. After an hour I still couldn't work out why the program, which had been working not a day beforehand, had suddenly stopped working, so I asked my group to look at it while I went and watched the other groups show off their robots. When I returned at the end of the session, the program was exactly as I had left it, and they were still talking about Match of the Day.


Week 7

The brief for this week stated that we needed to implement Bluetooth in our program. I began by looking over the program to see where it would benefit from Bluetooth integration. I reprogrammed the calibration block to accept boolean values instead of button presses as its inputs.


The current limitations of the software mean that the loops are required in order to force the program to wait for the Bluetooth values to be received.
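In text form, those wait loops amount to polling a mailbox until a boolean message arrives. A minimal sketch, with the mailbox modelled as a standard-library queue (the real software uses message blocks, not a queue; this is just the shape of the loop):

```python
import queue

def wait_for_bluetooth_bool(mailbox):
    """Spin until a boolean message is available in the mailbox, then
    return it, mirroring the wait loops in the block program."""
    while True:
        try:
            return mailbox.get_nowait()
        except queue.Empty:
            # Nothing received yet: in the real program, the loop simply
            # re-reads the message block until a value appears.
            pass
```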

I then decided to add a Bluetooth function to the main program which allows the user to stop/start the program via Bluetooth boolean values. I asked my team if they could think of any other applications for Bluetooth in the current program, but received no suggestions in return.


As of now, I cannot work out any way to implement colour detection (as required in our bare-bones game plan) without breaking or rewriting the current program. I have asked the other members of my team to look at the program themselves over the weekend and the coming week, in the hope that one of them can come up with a better way.


Week 6

I began week 6 by planning different ways that I could make the robot follow a line. The most obvious was to have the light sensor measure colour, and steer in different directions based upon which colour it was detecting (black or white). However, after some experimentation, I could not find a way to make this work successfully. My second idea was to use the built-in 'Reflected Light Intensity' measurements that the robot could take, so I began creating a program utilizing this idea. By myself, as usual (despite asking some of the others for help numerous times).

The problem I encountered straight away was that the measurements seemed to vary wildly. The returned value on the white paper ranged between 40 when the sensor was in the robot's shadow and 70 when it was in direct light. The value on the black line was consistently around 8 or so. The value on the edge of the line (the important measurement) varied between 20 and 40 (again depending on whether the sensor was in the robot's shadow or not).

The basic idea for the program was to make the robot turn one direction (say right) when the light intensity matched white, and the other direction (left) when the reading matched black. When the reading was that of the edge of the line, the robot would continue in a straight line. Having no idea how to implement this smoothly, I looked for similar programs on online forums. One idea (from this video) was to connect the readings directly to the speed of each wheel motor. However, this required some modification of the values, as the robot otherwise drove far too quickly to measure consistently. So, following the program demonstrated in the video, I added the required mathematical processes (inverting and halving the values to accommodate our robot's motors, which due to the design are mounted backwards), and the robot now seems to follow the line. The first iteration of this code is pictured below.
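The proportional idea described above (readings driving the wheel speeds directly, with the values inverted and halved for our backwards-mounted motors) can be sketched in Python. The exact scaling here is illustrative, not the values from the block program:

```python
def motor_speeds(reflected):
    """Map a calibrated reflected-light reading (0 on black, ~100 on
    white) directly onto the two wheel speeds. On the edge (~50) the
    wheels match and the robot drives straight; on white or black one
    wheel outpaces the other and the robot turns back toward the edge.
    Speeds are negated and halved because our motors run backwards and
    the robot otherwise drives too fast to measure consistently."""
    left = -reflected / 2          # more white -> left wheel contributes more turn
    right = -(100 - reflected) / 2  # more black -> right wheel contributes more turn
    return left, right
```

For example, a reading of 50 (the line edge) gives equal speeds on both wheels, so the robot tracks the edge rather than the centre of the line.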

The custom block at the start is a calibration module which calibrates the light sensor's values to be 0 on black and ~100 on white. Unfortunately, this needs to be done every time the program is run. The calibration program is pictured below.
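The maths behind that calibration is a linear rescale between the two sampled extremes. A minimal sketch (the EV3-style calibration block works differently internally; this only shows the mapping):

```python
def calibrate(raw_black, raw_white):
    """Given a raw reading sampled on black and one sampled on white,
    return a function mapping any raw reading onto a 0 (black) to
    100 (white) scale."""
    span = raw_white - raw_black

    def normalise(raw):
        return (raw - raw_black) * 100 / span

    return normalise
```

With the readings mentioned above (around 8 on black, up to 70 on white), `calibrate(8, 70)` would map a mid-range edge reading to roughly 50.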


