Week 5

After much deliberation over the state of the program, I made the executive decision to change the basis of the program. The current program's reliance on the IR sensor made it unreliable; the IR sensor often returned widely varying proximity values, even when neither it nor the object it was measuring had moved since the previous reading. This meant the robot fairly regularly failed mid-run, simply driving into the wall (and, due to its tall, top-heavy design, falling over).
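We never implemented a software workaround before deciding to rebuild, but for illustration, jitter like this can be partially smoothed by taking the median of a short burst of readings. The sketch below uses the python-ev3dev2 library, which is purely an assumption on my part; our actual program was built in the graphical EV3 software.

```python
# Hypothetical sketch (python-ev3dev2, not the graphical EV3 software we used):
# a single IR proximity read can jitter even when nothing has moved, so take
# the median of several consecutive samples before acting on the value.
from statistics import median
from time import sleep

from ev3dev2.sensor.lego import InfraredSensor

ir = InfraredSensor()

def smoothed_proximity(samples=5, delay=0.02):
    """Return the median of several consecutive proximity readings (0-100)."""
    readings = []
    for _ in range(samples):
        readings.append(ir.proximity)  # raw reading: 100 = far away, 0 = touching
        sleep(delay)
    return median(readings)
```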

This issue, combined with the fact that the robot had no way of making precise 90-degree turns, meant the current plan and robot would not work together. What's more, we had jury-rigged the colour sensor onto one of the robot's legs, so parts of the robot's tracks and cables often interfered with its readings, and the sensor itself sat off-centre. Because of all these problems, we made the decision to rebuild the robot into a more usable design, one that positions the colour sensor more sensibly. The new design is the EV3MEG design. It still gives us access to the IR sensor, but provides a more stable and usable colour sensor armature, and has movable arms.

In other news, no more progress has been made on the YouTube videos, and the group member who rarely ever shows up didn't show up at all this week.


Week 4

At the start of Week 4, we began by allocating rough "roles" to group members. Without naming any names, two members were assigned to filming/editing the YouTube videos, two members were assigned to developing the idea for the game, and another member and I were to improve upon the current program (i.e. make it simpler, more efficient, etc.). Upon completion of these tasks, alongside the tasks set on Moodle, we would then reconvene to create the next part of the program. Or at least that was the plan.

I began by testing the program from the week before. I tested the colour sensor with a simple coloured piece of paper: the robot would drive over it and then follow its designated instructions. The test can be seen in the second section of the following video:

The next part of the program was maze navigation via the IR sensor. The general idea was that if the robot detected a wall, it would rotate 90 degrees to the right (as demonstrated in the video above). If it detected another wall immediately after, it would then rotate a further 180 degrees, thereby heading left relative to its original direction. This worked fine in concept; however, the EV3 software is extremely limiting, and there was no easy way to make the robot turn exactly 90 degrees. The robot could only turn for a set number of wheel rotations, a set number of motor degrees, or a set time at a set speed. To force it to turn exactly 90 degrees, I therefore had to measure the exact number of wheel rotations, motor degrees, or seconds a 90-degree turn took. I did a few tests, modified the number a little, and eventually got a rough estimate. I simplified the program for the test, removing any extra functions, so that we could ensure the IR sensor's program worked as intended. The test program can be seen below.

IRcode
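For reference, here is a rough text rendering of that logic. This is a sketch only: it uses the python-ev3dev2 library rather than the graphical EV3 software we actually used, and the motor ports, rotation count, and proximity threshold are placeholders, not our measured values.

```python
# Hypothetical sketch of the wall-avoidance logic (python-ev3dev2, not the EV3
# graphical software we used). ROTATIONS_PER_90 stands in for the value we
# found by trial and error; ports and thresholds are assumptions.
from ev3dev2.motor import MoveTank, OUTPUT_B, OUTPUT_C, SpeedPercent
from ev3dev2.sensor.lego import InfraredSensor

tank = MoveTank(OUTPUT_B, OUTPUT_C)
ir = InfraredSensor()

ROTATIONS_PER_90 = 0.55   # placeholder: wheel rotations for a 90-degree spin
WALL_THRESHOLD = 30       # placeholder: proximity below this means "wall ahead"

def turn_right_90():
    # Spin on the spot: left track forward, right track backward.
    tank.on_for_rotations(SpeedPercent(30), SpeedPercent(-30), ROTATIONS_PER_90)

while True:
    if ir.proximity < WALL_THRESHOLD:       # wall ahead: turn right
        tank.off()
        turn_right_90()
        if ir.proximity < WALL_THRESHOLD:   # still blocked: turn a further 180
            turn_right_90()
            turn_right_90()                 # net result: left of the original heading
    else:
        tank.on(SpeedPercent(30), SpeedPercent(30))   # path clear: drive forward
```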

At the end of the week, I checked in to see how much the others had done. The team assigned to creating the YouTube videos (one of whom didn't show up to either session this week) seemed more intent on adding music to the current videos and inventing "quirky" names for the upcoming ones than on actually filming anything useful. The planning team had drawn up a plan for a maze, which was useful, although, based on the size of the robot, the maze would be about 10 ft × 20 ft. I had to remind them we probably only had about a quarter of that space at most.


Week 3


The first task for week 3 was to produce a flowchart for our game, to aid us in programming it within the software. We began by making a simple flowchart detailing all the processes the robot would need to go through in order to complete the game; however, we did not include individual programming paths for each process. Instead, I plan to split each process into its own flowchart and treat it as its own program, which can be run separately from the main program, as sketched below. The hope is that this will make each process easier to program and potentially easier to implement, although it may cause complications when attempting to assemble the entire game as a whole program.
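In outline, the structure I have in mind looks like this (plain Python pseudostructure, purely for illustration; the real processes are flowcharts mapped onto blocks in the EV3 graphical software, and the function names are hypothetical):

```python
# Hypothetical outline of the modular plan: each flowchart becomes its own
# routine that can be run and tested on its own, or chained into the full game.
def navigate_maze():
    """One process: drive through the maze, avoiding walls via the IR sensor."""
    ...

def handle_colour_cues():
    """Another process: react to coloured markers on the maze floor."""
    ...

def reload_ammunition():
    """Another process: return to a set point to collect more ammunition."""
    ...

def full_game():
    # Assembling the whole game then becomes a matter of sequencing the modules.
    navigate_maze()
    handle_colour_cues()
    reload_ammunition()
```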

I began by creating a flowchart for a simple program that would utilize the robot's colour sensor. The program would make the robot move while no colour was detected; upon detecting blue beneath the colour sensor it would fire its weapon, and upon detecting red it would stop. I also wanted the program to be rerunnable with a press of the touch sensor, as the current system meant the program had to be re-selected on the main brick display each time, and a touch sensor press is a much simpler method of input.

ColourFiringProgram1

The image above depicts the first, simplified iteration of the process. It is designed to function as a single 'module', in the sense that it can be copied and pasted into other programs and should behave the same as it does on its own. We uploaded the program to the robot and tested it. It worked as intended, with one small issue: while the robot was executing the actions triggered by one colour (e.g. blue), the colour sensor would not detect other colours (e.g. red) and run their actions. As a result, the robot simply skipped any colour it passed over while still responding to a previous detection. We recorded a video of the test, which is to be uploaded to the YouTube channel.
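In text form, the logic amounts to roughly the sketch below (again using python-ev3dev2 as a stand-in for the graphical software; the shooter's motor port is an assumption). Note how the firing call blocks the loop, which is exactly why colours passed during a shot were skipped.

```python
# Hypothetical text equivalent of the first iteration (python-ev3dev2, not the
# EV3 graphical software we used; motor ports are assumptions).
from ev3dev2.motor import (MediumMotor, MoveTank, OUTPUT_A, OUTPUT_B,
                           OUTPUT_C, SpeedPercent)
from ev3dev2.sensor.lego import ColorSensor, TouchSensor

tank = MoveTank(OUTPUT_B, OUTPUT_C)
shooter = MediumMotor(OUTPUT_A)        # assumed port for the firing mechanism
cs = ColorSensor()
ts = TouchSensor()

while True:
    tank.on(SpeedPercent(20), SpeedPercent(20))   # drive while no colour is seen
    colour = cs.color
    if colour == ColorSensor.COLOR_BLUE:
        tank.off()
        # This call blocks until the shot finishes -- while it runs, no other
        # colour can be detected, hence the skipped-colour issue we observed.
        shooter.on_for_rotations(SpeedPercent(75), 1)
    elif colour == ColorSensor.COLOR_RED:
        tank.off()
        ts.wait_for_bump()             # a touch sensor press restarts the run
```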

In the second session, I expanded upon the program, adding an operation allowing the robot to count the shots it has fired; this will be used to track the remaining ammunition. I also added a check so that the program stops if the touch sensor is pressed while it is running, and likewise stops once it is out of ammunition. The second case is temporary, as there is currently no designated path or area for the robot once it has run out of ammunition, but the branch is there to be built upon.

ColourFiringProgram2

Above is the second iteration of the program. Due to a team member’s absence, I have been unable to test this program as I have not had access to the robot.
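Untested as it stands, the added logic amounts to roughly the following (the same hypothetical python-ev3dev2 rendering as before, with a made-up starting ammunition count):

```python
# Hypothetical sketch of the second iteration's additions: a shot counter and
# two stop conditions (touch sensor pressed mid-run, or ammunition exhausted).
from ev3dev2.motor import (MediumMotor, MoveTank, OUTPUT_A, OUTPUT_B,
                           OUTPUT_C, SpeedPercent)
from ev3dev2.sensor.lego import ColorSensor, TouchSensor

tank = MoveTank(OUTPUT_B, OUTPUT_C)
shooter = MediumMotor(OUTPUT_A)        # assumed port for the firing mechanism
cs = ColorSensor()
ts = TouchSensor()

AMMO_CAPACITY = 5                      # placeholder starting ammunition count
shots_fired = 0

while shots_fired < AMMO_CAPACITY:     # temporary: end the program when empty
    if ts.is_pressed:                  # touch sensor pressed mid-run: stop
        break
    tank.on(SpeedPercent(20), SpeedPercent(20))
    if cs.color == ColorSensor.COLOR_BLUE:
        tank.off()
        shooter.on_for_rotations(SpeedPercent(75), 1)
        shots_fired += 1               # count shots to track remaining ammo

tank.off()
# An out-of-ammunition path (e.g. returning to collect more) can hang off here.
```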



Week 2


We began by modifying the robot's design to better suit our needs: we moved the colour sensor from the 'shoulder' area of the robot down to the leg, to allow the robot to read colour prompts laid out on the floor of the maze. We also removed several aesthetic pieces in order to make the robot's main brick more accessible. With the modifications complete, we were able to begin learning how the software controls the robot, and to start planning out our game around the modules available in the software. However, we may be required to modify the robot again in the future.

I began learning about the Lego Mindstorms development environment by reading through the EV3 Help page development guides, which give detailed tutorials on using the basic modules to operate the robot's sensors and display. Alongside these were modules for basic math and logic, most of which I understood from what I had learnt in the 121COM and 124MS modules.

I had hoped that other members of my team would make an effort to learn the software too, even just to get a better understanding of its limitations. Those hopes seem to have gone unanswered, however: as of the beginning of week 3, none of the other team members appears to have learnt any of it. This is evident in their design plans, as several elements of the design do not appear, at first glance, to be possible given the constraints of the Lego software, as opposed to an actual programming language.



Week 1


After a brief introduction to the first ALL project: Lego Robot Game, we began by reading through the Assessment Specifications to get a good idea of the level of complexity our game required in order to achieve high marks.

The first Activity Brief detailed some of the requirements that would make the game complex. We discussed these and formulated a basic plan for a game based around some of them. The general basis of the game is that the robot utilizes an infrared sensor and a colour sensor to navigate its way around a maze. Colour sensor cues would indicate targets; the robot would fire its weapon at them, keep a record of its ammunition count, and return to a set point to collect more ammunition. With this in mind, we selected the EV3RSTORM Lego Mindstorms EV3 robot design on which to base our robot. Upon receiving the Mindstorms robot kit, we proceeded to assemble the robot following the EV3RSTORM design.

