Report – Evaluation and Reflection

Looking back at our game and viewing the final program, I believe that whilst the game works as intended and matches the plan that was eventually laid out for it, it still lacks the qualities one would expect of a game. By this I mean that there is very little user interaction: once the user has selected the number of laps, their only remaining role is to replace the cardboard boxes which the robot picks up back onto the track for its next pass, which could be viewed as an oversight in the plan of the game. On top of this, the game lacks any real goal or final outcome; once the robot has completed its designated number of laps, it simply stops.

With regard to the process of planning the game, we mostly ended up coming up with ideas on the spot once we had ample working code for the current plan. This meant we built upon the idea of a line follower, adding functions as we went. Looking back, this was clearly the wrong way to go about designing a game: we should have spent more time planning out the entire game, with all its objectives and functions, before starting on the program.

As for academic skills, the biggest lesson learnt was never to rely on others until you understand their motivation. My assumption at the start had been that all members of the course were attending university to work, but this assumption was proved false over the duration of this project. Whilst delegating tasks to others works in theory, the structure of the project as a whole then relies on each person completing those tasks on time (or at all) for the project to be completed.

When it comes to professional practices, this project has demonstrated the importance of using all forms of media in order to present and document my work, and to use proper version control when creating complex programs.


Report – Wireless Communication/Bluetooth

The Bluetooth functions of the robot require a third-party application which is only available from the Google Play Store under the name ‘EV3 Mailbox Remote’. The application works by selecting a Bluetooth connection to the robot, inputting a matching mailbox title, and then sending either a String, Value or Boolean data type. For certain functions, the robot will only accept certain data types, and so will not change state if it receives a different type.

For our game, we decided to implement user interaction via Bluetooth. At the start of the game, the user can initialize the game via either a button on the front panel, or by sending a Boolean ‘True’ value to the robot. The robot then requests that the user input the number of laps via the screen on the front. The user can then send a value via Bluetooth to decide the number of laps. This number will be displayed on the screen throughout the duration of the game. The user can opt to skip this input via a button, after which the game will set the number of laps it is to complete to 3.
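The lap-selection behaviour described above can be sketched in plain Python. The actual program is built from EV3-G graphical blocks, so the names here (`choose_lap_count`, `bluetooth_value`, `skip_pressed`) are hypothetical stand-ins for the real blocks, and the 3-lap default is taken from the description above.

```python
DEFAULT_LAPS = 3  # default used when the user skips the Bluetooth input

def choose_lap_count(bluetooth_value, skip_pressed):
    """Return the number of laps the robot should complete.

    bluetooth_value -- the number received via the Bluetooth mailbox,
                       or None if no message has arrived.
    skip_pressed    -- True if the user pressed the skip button.
    """
    if skip_pressed or bluetooth_value is None:
        return DEFAULT_LAPS  # fall back to the 3-lap default
    return int(bluetooth_value)
```

The same value would then be displayed on the brick's screen for the duration of the game.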

References:

Ferdinand Stueckler (2013) EV3 Mailbox Remote [online] available from <https://play.google.com/store/apps/details?id=com.EV3.Mailbox&hl=en> [2 December 2015]


Report – Components and Processing

EV3 Components

The LEGO® EV3 Home set which we were supplied with contains 1 EV3 programmable brick, 1 Colour sensor, 1 Touch sensor, 1 Remote Infrared Beacon, 2 Large Motors and 1 Medium Motor (LEGO Corporation n.d.).

The main EV3 brick has 4 RJ12 input ports for connecting the sensors and 4 RJ12 output ports for connecting the motors, along with a USB host port used either for chaining the brick to other bricks or for a Wi-Fi dongle. For transferring data between computer and brick, there is a Mini USB port and a Micro SD card slot. The brick acts as a microcomputer: it allows you to control the motors and sensors individually, take readings from the sensors and display them on the screen, manage Wi-Fi connections, and run programs stored on the robot. There is also a high-quality speaker on the back. It can be powered either by 6 AA batteries or by a 2050mAh rechargeable lithium DC battery pack. The sensors are each powered and accessed via the RJ12 ports.

The brick runs on a 300MHz ARM9 processor, with 16MB of Flash memory and 64MB of RAM. The operating system is Linux-based (Soldaat 2013).

Aside from the brick and sensors, the set also contains over 500 LEGO® Technic pieces with which to build the desired robot model, as well as sets of instructions for a few of the models. The rest of the models’ instructions can be accessed via the LEGO website.

References:

LEGO Corporation (n.d.) Mindstorms EV3 Product Page [online] available from <http://www.lego.com/en-gb/mindstorms/products/31313-mindstorms-ev3> [30 November 2015]

Soldaat, X. (2013) Bot Bench: Comparing the NXT and EV3 bricks [online] available from <http://botbench.com/blog/2013/01/08/comparing-the-nxt-and-ev3-bricks/> [30 November 2015]


Week 10

So this week is the final week, meaning that by the end of it we needed a working demo video of the robot. To get to that point, there were still some problems preventing the robot from finishing each lap, but after fixing them, I can now present the following 1-lap demo:

The major problems were ones which halted the program, or otherwise prevented the robot from completing the lap. The most prominent of these was that, because of the way we stored the paper track (rolling it up), the track had to be pinned down, and between the pins the edges of the paper kept riding up. This was a problem because the robot's wheels would catch on the edge of the paper and either cause it to stop or cause it to veer unexpectedly, often taking it off the track.

Another issue was that, due to the small space in which I had to record the demo, when the robot executed the Green function (turned off the track and dropped the box) the IR sensor function would trip on detecting the wall, causing the robot to stop. To get around this I had to disable the IR function for the main demo.

The other big problem was the dispensing of the cardboard box. The arm seemed to drop the box perfectly on any non-filmed test run, but upon attempting to film the robot, the box would nearly always get stuck on the end of the arm. To fix this I simply upped the speed at which the arm lowers, giving the box added momentum as it was thrown. I also had to slightly decrease the angle in both the lifting and dropping functions to prevent the arm motor getting stuck, as described in the previous post.

As a comparison of the final game and the initial brief, the game scenario matches the following complexity requirements:

  • Movements
  • Pick-up and reposition of an object
  • Colour detection
  • Touch detection
  • Position change detection (IR)
  • Information display
  • Button Input
  • Line Following
  • Timing control for an activity
  • Detection of the position of an object based on its distance
  • Repetition of behaviours (loop)

A full image of the program’s code is available here.


Week 9

This week I finished the program. And by "finished" I mean I got the program to a state where it will follow the line and complete all the actions it is given. As it currently stands, there ARE actions added for Blue, Green and Red, the 3 colours featured on the track.

However, they aren’t without problems. The current program makes the robot pick up an object (a crude cardboard square) upon detection of Blue. The problem with this function is that the robot’s arm has a tendency to get stuck (due to the design and structure of the robot). When it gets stuck, the robot stops dead until the arm either falls off or the program is stopped and restarted. I am currently looking for a way to fix this. The function is presented below.

BlueFunction

It is initialized via a boolean variable, which is set to true by the main function upon detection of blue. The reason for having the function in a separate instance is so that there can be a delay which prevents the function from being initialized in the first 3 seconds, or within 13 seconds of any previous initialization thereafter.
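The delay logic described above can be sketched in plain Python. The real program is EV3-G blocks, so `blue_should_trigger` and its parameters are hypothetical names; the 3-second and 13-second windows are the ones stated above.

```python
START_DELAY = 3   # seconds before the function may first fire
COOLDOWN = 13     # seconds required between successive initializations

def blue_should_trigger(now, start_time, last_trigger):
    """Return True if the Blue function is allowed to fire at time `now`.

    start_time   -- the time the program started
    last_trigger -- the time of the previous initialization, or None
    """
    if now - start_time < START_DELAY:
        return False          # still within the initial 3-second window
    if last_trigger is not None and now - last_trigger < COOLDOWN:
        return False          # within 13 seconds of the last firing
    return True
```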

Upon detection of Green (which, due to the current track design, comes after Blue) the robot turns, drives off the track, drops the cardboard square it is carrying, and then reverses, turns around, and finds its way back to the line. The problem here is that, because of the robot’s movement (zig-zagging across the right edge of the line), it sometimes misses the green circle. This can hopefully be remedied by adding some green tape over the black part around the circle. There isn’t really any need to show the program for this, as it is just a string of simple motor command blocks.

The Red circle is right at the end of the track. The robot counts how many times it detects it as the number of laps completed. This value is then compared with a value for the maximum number of laps, which is set at the beginning, when the user sends a value via Bluetooth to the robot. If the user chooses to skip this with a button, the default value of 3 laps will be taken (this default may need to be changed at a later date, as the robot takes a fairly long time to get around the whole track). The function is presented below:

RedFunction

This function is also part of a separate instance, again so that it can have a delay in place that prevents it being initialized within 3 seconds of being initialized previously.
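The lap-counting comparison can be sketched in plain Python. Again, the actual function is EV3-G blocks; `LapCounter` and `on_red_detected` are hypothetical names for illustration, with the default of 3 laps taken from the description above.

```python
class LapCounter:
    """Counts Red detections as laps and signals when the game should end."""

    def __init__(self, max_laps=3):
        self.max_laps = max_laps  # from Bluetooth, or the default of 3
        self.laps = 0

    def on_red_detected(self):
        """Called once per (de-bounced) Red detection.

        Returns True when the designated number of laps is complete.
        """
        self.laps += 1
        return self.laps >= self.max_laps
```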

In addition to the colour functions, I added a small segment of code that uses the IR sensor. This code runs parallel to the main code, and upon detection of an object (say the user’s hand) at close range, stops both large motors. This comes in handy when the user needs to stop the program or move the robot mid-execution, as they can simply hold their hand in front of it and the robot will stop. This IR function is presented below:

IRFunctionAddition
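The parallel IR check can be sketched as a single loop iteration in plain Python. The threshold value and the names here are assumptions for illustration, not the actual EV3-G blocks.

```python
STOP_DISTANCE = 10  # assumed proximity threshold for "close range"

def ir_check(proximity, motors):
    """One pass of the parallel IR loop: stop both large motors when an
    object (e.g. the user's hand) is detected at close range.

    motors -- dict of motor power levels, mutated in place and returned.
    """
    if proximity < STOP_DISTANCE:
        motors['left'] = 0   # stop both large motors
        motors['right'] = 0
    return motors
```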

Finally, here is a demonstration of the Blue and Green functions combined for the final game:

The group member who rarely turns up actually made an appearance on the Friday session this week, however he seemed more intent on sleeping than on doing any work. Two of the other members did not show up, and did not give any notice to the group about their absence.


Week 8

Beginning with a review of my request to my group at the end of last week:

None of my team even looked at the program. None of them know how the current program works, nor even know how the software works. I have come to the conclusion that I will have to complete this program alone, otherwise it will remain incomplete.

Furthermore, no progress has been made with the videos. We have no footage of the current program. Whilst I could also take the time to film the robot myself, I feel that my time on the project is better spent on the program. One group member persists in evading all contact and all group sessions. Yih-Ling has spoken to our group with regard to the individual, although that does not aid us (me) in completing the current project with one less person.

In the first session this week, I had a breakthrough. I managed to create a program that made the robot follow the line using the colour measurements instead of reflected light intensity. This paved the way for the rest of the program, and I was able to quickly implement a way for the robot to detect other colours whilst continuing to follow the line smoothly.
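The core decision of that follower can be sketched in plain Python. The real program is EV3-G blocks, and the steering labels here are hypothetical; the idea, per the description above, is to zig-zag along the right edge of the line using the colour measurement, while any other colour hands control to its own function.

```python
def line_follow_step(colour):
    """Decide what to do for one iteration of the colour-based follower.

    On black, steer right (away from the line); on white, steer left
    (back towards it) -- producing the zig-zag along the line's right
    edge.  Any other colour is passed through to trigger its function.
    """
    if colour == 'black':
        return 'steer_right'
    if colour == 'white':
        return 'steer_left'
    return colour  # blue/green/red trigger their own behaviour
```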

I created a flowchart as a plan for the game, detailing roughly what each colour function would do, and roughly how the program would execute each function.

LineFollow

One of the group members created a track over the week, which we were to use in the presentation of the program (as detailed in the brief for this week). Whilst this was good news, the bad news was that the robot failed to execute the program properly on the day of the presentation. I had to explain to the person running the session that the robot didn’t seem to be working, and requested that they skip us. Whilst the other groups went ahead and demonstrated their robots, gaining valuable feedback on their games, I sat and attempted to fix the program, and my group, the wonderful help that they are, sat and watched me for an hour. Two of them even went to the other side of the desk and talked about Match of the Day. After an hour I still couldn’t work out why a program that had been working not a day beforehand had suddenly stopped working, so I asked my group to look at the program while I went and watched the other groups show off their robots. When I returned at the end of the session, the program was exactly how I had left it, and they were still talking about Match of the Day.


Week 7

The brief for this week stated that we needed to implement Bluetooth in our program. I began by looking over the program to see where it would benefit from Bluetooth integration. I reprogrammed the calibration block to accept boolean values instead of button presses for its inputs.

LSCalibrateBT

The current limitations of the software mean that the loops are required to force the program to wait for the Bluetooth values to be received.
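That wait loop can be sketched in plain Python. `wait_for_mailbox` and `read_mailbox` are hypothetical names standing in for the EV3-G mailbox read block, which returns nothing until a message arrives.

```python
def wait_for_mailbox(read_mailbox):
    """Block until a value arrives in the Bluetooth mailbox.

    read_mailbox -- a callable standing in for the mailbox read block;
                    it returns None until a message has been received.
    """
    while True:
        value = read_mailbox()
        if value is not None:
            return value
```

In the graphical program this is simply a loop around the mailbox read block that exits once a value is present.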

I then decided to add a Bluetooth function to the main program which allows the user to stop and start the program via Bluetooth boolean values. I asked my team if they could think of any other applications for Bluetooth in the current program, but received no suggestions in return.

LineFollowComplexBT

As of now, I cannot work out any way to implement colour detection (as required in our bare-bones game plan) without breaking or rewriting the current program. I have asked other members of my team to look at the program themselves over the weekend and the coming week, in the hope that one of them can come up with a better way.
