Calder Valley Brickshow Scarf Update #1

Yesterday I showed several of my robots at the Calder Valley Brick Show, during which my Weav3r loom was making a scarf. This time around I have decided that I shall be auctioning the scarf via eBay, with the proceeds going to the charity supported by the show: Forget Me Not Children’s Hospice.

The loom didn’t quite finish the scarf during the event. As can be seen from the photos above, there is still a little warp yarn left on the bobbins. I shall complete the scarf over the next couple of days and then post an update with photos of the finished scarf and a link to the eBay auction.

Braill3 – An EV3-based Braille Bricks Reader

Several months ago I had been thinking about building a robot that could read the Braille bricks developed by LEGO. They have been available in an educational context via the https://legobraillebricks.com/ website for some time now. Unfortunately, those ideas came to nought at the time, so I put the idea to bed.

Roll on a few months and the group of RobotMak3rs that I’m a member of had one of their regular remix challenges – i.e. to mix an existing set with some robotics to come up with a new idea. By then the domestic LEGO Braille Bricks set (40656 in the UK) was out, so I figured it was time to revisit my earlier ideas.

Initial Ideas

Right from the start I wanted the bot to read out the Braille using text-to-speech (TTS). The bricks are intended for visually impaired users, so having only an on-screen display of the text would be inappropriate. The Spike/Robot Inventor hub doesn’t have the ability to generate complex sounds itself and would have relied on an external app running on a mobile or tablet to perform that task. Instead I decided it would be far better to use an EV3 running Pybricks MicroPython, as that can perform TTS output. In addition to having the Braille read out, I wanted all the prompts that appear on screen to have an audio equivalent.
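To give a flavour of what that looks like in practice, here is a minimal Pybricks MicroPython sketch of a spoken-and-displayed prompt. The prompt text and voice settings are illustrative only, not lifted from my actual program:

```python
#!/usr/bin/env pybricks-micropython
from pybricks.hubs import EV3Brick

ev3 = EV3Brick()

# Slow the speech slightly so prompts are easier to follow.
ev3.speaker.set_speech_options(language='en', speed=150)

def prompt(text):
    """Display a prompt on the EV3 screen and read it out via TTS."""
    ev3.screen.clear()
    ev3.screen.print(text)
    ev3.speaker.say(text)

prompt("Select a row with the left and right buttons.")
```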

My initial idea for the mechanics was to have three medium EV3 motors, each with a 3×5 L-beam attached. As the bot moved along the line of Braille it would rotate the motors such that the tip of each L-beam touched either a stud or the top of the brick. The difference in motor angle would indicate a dot or no dot. However, this idea was quickly discarded: the stud height is only 1.7mm, and that height, and therefore the change in angle, was not sufficient to reliably distinguish the presence of a stud. It would also have used three motors, leaving only one to move the bot along a row. Since I wanted it to be able to read multiple rows of text, I’d have needed five motors (X, Y, plus three for touch), which is not possible with the EV3’s four motor ports.

My next idea was for a bot with an arm that had three touch switches mounted on it, with the arm lifting up and down. This way the angle of the motor would be irrelevant. The arm would need to move down onto each column of studs and back up again so that it wouldn’t get snagged on the next column as it moved sideways.

I went through various arrangements of the switches, settling on something similar to below for a number of prototypes:

The principle here was that the stud would push against the pin, which in turn, via rotation of the 3×3 quarter-circle beam, would press in the button. The motors would have had to be mounted at 90° to each other because the width of the switches (initially) prevented them from being next to each other. The big problem with all of these designs is that the springs in the switches are remarkably firm. The motor, pushing the arm down, would have to apply a significant force – akin to trying to hold out a 1kg mass at arm’s length. It also looked ugly, and I tend to work on the principle that if it’s ugly then it’s likely to be wrong.

The ‘Fingertip’

After going through several iterations of the ideas above, I had a brainwave. It was possible to mount the motors in parallel and ‘dog-leg’ the pins such that they could still touch the studs. To counter the force required to press in the switches, linear actuators would be used instead. Although this would slow down the sensing action, it trades speed for accuracy. I ended up with the mechanism below:

This mechanism worked perfectly, with an unexpected discovery on the switches, discussed further on.

Bridge, Switches, and Calibration

The Braille sensing mechanism (the ‘lift’, as I think of it) needed to move in both the X and Y axes so that several rows of bricks could be placed on the baseboards supplied with the kit. The lift would be mounted on a bridge, allowing for Y-axis movement, and the bridge itself would move in the X-axis. The bridge took a few attempts to get right: the combination of the mass of the lift and the force required to press in the switches caused the bridge to flex, so it needed a few revisions to make it rigid enough without being too bulky.

One thing I had never realised about the EV3’s switches is that they trigger at the start of their travel, i.e. they don’t need to be pushed all the way in. Had they needed to be depressed all the way, it’s quite possible this model would never have worked. LEGO parts, being plastic, are never perfectly aligned, which could have meant one switch reaching the end of its travel before either or both of the other switches had triggered. No amount of extra downward force could then have pressed the other two, as the first switch would have blocked any further movement. Thankfully they trigger at the start of their travel, so it’s still possible to push down far enough for the neighbouring switches to also trigger.

Due to slight flex in the model, it’s not possible to have the motor wind the linear actuators to the same position for every row of Braille; the middle two rows can require a little more force. To solve this, the bot requires calibration on first use and offers to calibrate on start-up as well. Calibration requires that an L (⠇) brick is placed at the start of each row; the bot then tests each of those bricks. For each row, the motor position at which the last switch activates is stored on disk for repeat use. When reading, the bot drives the motor to just beyond that stored angle to ensure that all switches can be activated.
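A rough sketch of that calibration step in Pybricks MicroPython is below. The ports, speeds and file name are assumptions for illustration, and the movement between rows is left out:

```python
from pybricks.ev3devices import Motor, TouchSensor
from pybricks.parameters import Port

lift_motor = Motor(Port.A)   # drives the linear actuators (assumed port)
switches = [TouchSensor(Port.S1), TouchSensor(Port.S2), TouchSensor(Port.S3)]

def calibrate_row():
    """Lower the fingertip onto the row's L brick and record the motor angle
    at which the last of the three switches triggers."""
    lift_motor.reset_angle(0)
    lift_motor.run(50)                    # wind the actuators down slowly
    while not all(s.pressed() for s in switches):
        pass
    angle = lift_motor.angle()
    lift_motor.run_target(200, 0)         # wind back up again
    return angle

# One stored angle per row; when reading, drive a little beyond it.
# (The real bot moves the lift to each row between measurements.)
angles = [calibrate_row() for _ in range(4)]
with open('calibration.txt', 'w') as f:
    f.write(','.join(str(a) for a in angles))
```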

Accessibility

As I said at the start, I wanted this model to be accessible to its target users, so all instructions are read out as well as displayed on screen. All button operations give a small click as audio feedback, along with a relevant audio prompt, e.g. saying which row has been selected to read. To aid in placing the Braille bricks on the baseboard there are tiles along the first column; subsequent bricks are simply placed next to the previous ones. The EV3 has been oriented with the speaker facing the user so that it can be heard clearly.

Video

As part of our group’s remix challenge we have to produce a video relating to our build. I opted to have the video show parts of the robot and the robot in operation. Since the bot converts Braille to speech, I figured I’d have the voice-over performed by the bot as well (I’m never a fan of speaking on camera, and being a Brit I always feel that I sound sarcastic 😆). I also thought it would be a fun little feature to have the subtitles show in Braille first and then wipe over to the actual text. The resulting video is below:

Build Instructions and Code

Build instructions: http://jander.me.uk/LEGO/resources/Braill3-BIs.pdf

Python code: http://jander.me.uk/LEGO/resources/braille.zip

The Python code needs to be run under Pybricks MicroPython on ev3dev, which requires a microSD card. The official LEGO EV3 MicroPython installation can be found at: https://education.lego.com/en-us/product-resources/mindstorms-ev3/teacher-resources/python-for-ev3/

Operating Instructions

  • On the first ever run of the program, it will pre-generate all of the standard speech prompts, so that it doesn’t have to perform the TTS step every time a common prompt is needed. It only does this once.
  • On first run the program will insist on a calibration step. It will offer to perform a calibration on every subsequent run, but it defaults to ‘no’. To calibrate, perform the following:
    1. Put an L brick at the start of each of the 4 rows
    2. Press the centre button. The bot will then test each of the sensors and the motor, and store the calibration for future use.
  • Lay out the Braille as wanted. There are 4 rows that can be used. Select the row to be read with the left and right buttons, and press the centre button to read. Spaces between words can be either one or two columns of studs wide; three or more empty columns will end the row of text (a sketch of the decoding idea follows this list).
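By way of illustration (this isn’t my actual code), each Braille cell is two columns of three studs, read by the three switches as the fingertip steps across; the sensed dots are then looked up against the standard Braille dot numbering:

```python
# Dots are numbered 1-3 down the left column and 4-6 down the right column.
ALPHABET = {
    frozenset({1}): 'a',      frozenset({1, 2}): 'b',
    frozenset({1, 4}): 'c',   frozenset({1, 4, 5}): 'd',
    frozenset({1, 5}): 'e',   frozenset({1, 2, 3}): 'l',
    # ... remaining letters omitted for brevity
}

def decode_cell(left_col, right_col):
    """Each column is a tuple of three booleans from the touch switches."""
    dots = {i + 1 for i, hit in enumerate(left_col) if hit}
    dots |= {i + 4 for i, hit in enumerate(right_col) if hit}
    return ALPHABET.get(frozenset(dots), '?')

# Dots 1, 2 and 3 -> 'l', the letter used for calibration.
print(decode_cell((True, True, True), (False, False, False)))
```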

Robot Olympics Remix: Eque5trian – Show Jumping

In March 2021 the members of the RobotMak3rs RLOC were challenged to come up with ideas for an Olympic Games remix involving two kits, one robotic and one non-robotic. My idea was to mix the LEGO White House (21054) kit:

and the Robot Inventor (51515) kit (plus four pulley wheels):

to create an Olympic Show Jumping model:

Model Modes

The model has two modes of operation: automaton and game. In automaton mode, the horse will go around the arena leaping over the jumps of its own accord. In game mode, there are speed and jump controls and the player attempts to get the horse’s speed right and jump at the correct time. A video showing the model in operation and some of the details about it is below:

Model Operation

What was not shown in the video is how to start the model off. The model needs to set its initial positions. It does this by rotating the jump motor to its 0° position, then repeatedly rotating the horse motor to its 0° position until the horse ends up in its start position. This is required due to the 36T:60T gearing used to drive the horse’s position. The colour sensor will detect the white 3L beam when it’s in the start position.
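My programs for this model are written in the block language, but a rough Python equivalent of that start-up sequence, using the Robot Inventor hub’s Python API, would look something like the sketch below (the port letters and speeds are assumptions):

```python
from mindstorms import MSHub, Motor, ColorSensor

hub = MSHub()
jump_motor = Motor('A')          # assumed port
horse_motor = Motor('B')         # assumed port
beam_sensor = ColorSensor('C')   # sees the white 3L beam at the start position

# Home the jumps first.
jump_motor.run_to_position(0)

# Because of the 36T:60T gearing, the horse only reaches its start position on
# certain motor revolutions, so keep taking full turns back to 0 degrees until
# the colour sensor sees the white beam.
horse_motor.run_to_position(0)
while beam_sensor.get_color() != 'white':
    horse_motor.run_for_degrees(360)

# The jumps can now be replaced; the real model waits for a tap gesture
# on the hub before carrying on.
hub.motion_sensor.wait_for_new_gesture()
```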

The steps to start the model off are:

  1. Before starting the code, remove all the jumps – so that they don’t get knocked over during initialising.
  2. Start the program. The horse will go through its location detection steps.
  3. Put all the jumps back in their positions.
  4. Tap the hub – it waits for a tap gesture before carrying on.
  5. Enjoy the model’s action.

Build Instructions and Code

I have produced build instructions, which are linked below. There are two programs, one for each mode, which are also available below. The code has been written in the default block language for the 51515 hub. My BIs and code are released under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International licence:

Other Model Submissions

Other models from both the Robotmak3rs and the general community can be found at: https://www.facebook.com/hashtag/robotmak3rsolympics

Weav3r Loom Update / Photoshoot

I’ve been working on the loom for some time now, trying to iron out some of its issues. The main problems were down to the pattern setter, the ‘Jacquard’, getting jammed. This was mainly caused by two things: 1) uncontrolled tension in the warp threads, and 2) vibrations during weaving causing the floating pins to move out of place. To explain these issues, I built a demonstration model.

Demonstration

When I’ve shown the loom off at exhibitions, I’ve struggled to explain easily how the pattern-setting mechanism works. So, to make it simpler, I built a little model of the mechanism, as shown in the picture below. There are two beams with floating pins inside. The picture below shows the front beam with three of the five pins set forward. The beam behind the vertical yellow axles, the “heddles”, can lift upward.

Demonstrator during pattern set.

The picture below shows the lifting beam in its uppermost position, thus showing how the patterns are set on the heddles.

Demonstrator during “weaving”

Both of the pictures above also show where the problems with the loom occur.

Problem #1

The first problem occurred when setting the pattern. The loom needs the heddles to be in their lowest position for the pins to move back and forth. In the original build of the loom, the only way of maintaining tension in the warp threads was by applying friction to the warp thread supply drum. Due to how the loom was loaded with threads, and how this drum was wound, it was probable that the tensions would be mismatched across the threads. As the loom worked through row by row, some threads could end up at much higher tension than the others. This caused issues when the setter beam needed to drop, as it couldn’t pull down against that tension, resulting in the setter beam sitting slightly crooked. When the pin setter tried to move the pins, they would not line up with the rear bar’s holes, resulting in the entire mechanism jamming up.

Problem #2

The second picture shows how the vibrations could be a problem. As the shuttle passed from side to side there would be a small amount of vibration, or buzz, from the motors. This would cause the unsecured pins on the front beam to precess backwards, under the heddles that had been lifted. Again, when the bar was lowered it would not be able to reach its lowest position, as these precessed pins blocked its path.

Solution #1

The solution to problem #1 was to build a complete replacement for the original thread tensioner. This time a set of 8 driven pinch rollers was implemented, along with a gearbox to control the warp thread supply drum.

Warp thread pinch rollers

The picture above shows the new pinch rollers and the warp supply drum. There is a gearbox, shown in the picture below, that connects the drive of the pinch rollers to the warp supply drum.

Warp thread supply drum gearbox

When the gearbox is set in one direction, the supply drum rotates very slightly faster than the pinch rollers, with a white clutch gear for protection. This allows for much faster loading of the loom: the threads can be placed in the rollers and clamped down, then tied to the drum, and the loom can be set to wind the threads onto the drum with all threads at equal tension. Moving the gear to the other selection connects the drum to a gear on a blue axle pin, which simply applies some friction to prevent the drum from spinning uncontrollably as the loom weaves. Neutral allows for rapid removal of threads when needed.

Warp pinch lower rollers

The lower pinch rollers, as shown above, are all driven by a large EV3 motor. When weaving, just before the heddles are lowered, these rollers will move the thread forward by one weft thickness (adjustable during operation), thus lowering the tension in the loom. Tension is restored when the cloth is wound forward on to the take-up drum at the front.

Loading warp threads

Loading the threads is simple – 4 threads per pinch roller, and then clamped in place with the top rollers as shown below.

Warp threads clamped in the pinch rollers

Solution #2

Various solutions to the vibration problem were tried, most involving attempts to increase the sliding friction on the pins. None of them worked; no way of applying just a little friction in a LEGO-only manner could be found. Instead, the simple solution of tilting the loom slightly forward was tried. This means that if the pins do precess, they move forward rather than backward thanks to the slight slope. Raising the back of the loom by 1.5M over a distance of 48M gives just under 2° – not a massive tilt, but just enough. The beams running under the loom, as shown below, give that tilt.

Beams running under the loom to tilt it.
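For anyone wanting to check that figure, the arithmetic is simply:

```python
import math

rise = 1.5   # modules (M)
run = 48     # modules (M)
print(round(math.degrees(math.atan(rise / run)), 2))   # 1.79 degrees, i.e. just under 2
```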

Photoshoot

Whilst I was taking pictures of various parts, I decided to take some (hopefully) good pictures of the rest of the loom while I was at it. I include those below.

Some of the pattern pins, as seen from the front.

The grey racking at the front of the picture above has a partner rack at the back. This is used by the pattern setter to move left and right.

The cloth drum width sensor.

The white and blue flap above is connected to the medium motor. After each pass of the shuttle, the motor lifts the flap, and then lowers it gently until it stalls. The motor angle is then used to compute how wide the cloth take-up drum has become. That in turn is then used to compute the angle of rotation needed to ensure a consistent movement of the cloth, i.e. the width of the weft. This should be the same distance as the pinch rollers will move. The thickness of the weft can be adjusted during operation, along with minor adjustments to the drum and pinch roller positions.
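The loom itself isn’t programmed in Python, but the idea behind the measurement can be sketched in Pybricks as below. The port, duty limit and the flap-angle-to-diameter conversion are placeholders; the circumference calculation at the end is the real point:

```python
import math
from pybricks.ev3devices import Motor
from pybricks.parameters import Port, Stop

flap_motor = Motor(Port.A)      # placeholder port
WEFT_THICKNESS_MM = 1.0         # adjustable during operation on the real loom

def measure_drum_diameter():
    """Lower the flap gently until it stalls on the cloth, then turn the stall
    angle into an estimate of the take-up drum's current diameter."""
    flap_motor.reset_angle(0)
    flap_motor.run_target(200, 90)    # lift the flap clear first
    stall_angle = flap_motor.run_until_stalled(-50, then=Stop.HOLD, duty_limit=20)
    # Placeholder conversion: the higher the flap stops, the fatter the drum.
    # The real model uses the geometry of the flap linkage.
    return 30.0 + stall_angle * 0.2   # millimetres

def wind_angle_for_one_weft():
    """Degrees the take-up drum must rotate so the cloth advances by one weft."""
    circumference = math.pi * measure_drum_diameter()
    return 360.0 * WEFT_THICKNESS_MM / circumference
```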

Some of the heddles set, and the reed outward.

Part of the rear heddle beam lift mechanism.

The picture above shows part of the rear beam lift mechanism. There is an identical rack on the right-hand side of the loom, along with lifting racks at the ends of the beam.

Underneath showing the Jacquard pins, and the setter.

The light grey structure is the bottom of the pin setting mechanism. This moves left and right, under the pins, pushing the pins back and forth. It is programmed to set the pins in both directions, rather than returning to the ‘start’ on each shuttle pass.

Loom front.

Loom rear.

The shuttle loaded with thread.

The “Brain” of the system.

The entire loom is controlled by the EV3 shown above. Bluetooth is used to coordinate the actions of the other two EV3s. One EV3 manages the heddle setter (2 motors), the heddle lifter (1 motor) and the reed (1 motor). The other EV3 manages the shuttle (1 motor), the pinch rollers (1 motor), the cloth wind drum (1 motor) and the drum sensor (1 motor).
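The loom’s own programs aren’t written in Python, but for anyone wanting to try something similar, coordinating EV3s over Bluetooth can be sketched with Pybricks’ messaging module roughly as below (the mailbox names and commands are invented for the example):

```python
# On the "Brain" EV3, acting as the Bluetooth server:
from pybricks.messaging import BluetoothMailboxServer, TextMailbox

server = BluetoothMailboxServer()
server.wait_for_connection(2)          # wait until both worker bricks connect

heddle_box = TextMailbox('heddles', server)
shuttle_box = TextMailbox('shuttle', server)

heddle_box.send('set_row 12')          # tell the heddle EV3 to set the next row
heddle_box.wait()                      # block until it reports back
shuttle_box.send('pass')               # then send the shuttle across
shuttle_box.wait()
```

Each worker EV3 would run a matching BluetoothMailboxClient, waiting on its own mailbox and replying when its motors have finished.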

Pattern board and caddy.

The “Brain” in its scanning mode.

The pattern boards can be used to set up what to weave. They are 16×32 boards, with the pattern woven across the middle 16 warp threads, using 32 wefts. A plain “up, down, up, down” pattern will be woven into the 8 threads either side of the board’s pattern to ensure that the cloth remains in shape.
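As a small illustration of that padding (not the actual code), each scanned 16-thread pattern row ends up sandwiched between two plain-weave borders:

```python
def full_row(pattern_row, weft_index):
    """Pad a 16-thread pattern row out to all 32 warp threads.

    pattern_row: 16 booleans scanned from the pattern board (True = heddle lifted).
    The 8 threads either side get a plain up/down weave that alternates on
    every weft, keeping the cloth edges in shape.
    """
    plain = [(i + weft_index) % 2 == 0 for i in range(8)]
    return plain + list(pattern_row) + plain

# First two wefts of an all-down pattern row: the borders alternate.
print(full_row([False] * 16, 0))
print(full_row([False] * 16, 1))
```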

I’m hoping to take some videos of the loom in action next week and will post them to YouTube.

Further AI2/EV3 Bluetooth Coding

As my recent posts, Receiving BT Mailboxes from EV3 by an AI2 App & Updated BT comms between EV3 and AI2, have indicated, I’ve been working on getting an AI2 app to send and receive EV3 Mailbox messages via Bluetooth.

One of the annoyances I’d had is that the EV3 needs to know the name of the AI2 device in order to send messages to it. Up until now that name has had to be entered into the AI2 app by the user, which of course is prone to error in various ways, and there wasn’t a way of getting this information via the available AI2 components. So, I’ve added to my EV3Mailbox extension. It now has a GetDeviceName call which extracts that information to be used however one wishes. In my code I use this name to send a message to the EV3 on connection, for it to use in return messages. Example AI2 code is below:

AI2 BT Test Code

Downloads

As usual, I am making my code available for use under the Creative Commons Attribution-ShareAlike 4.0 International License

The sample .aia file should include the extension .aix file, but I am also making that specifically available for use in other people’s projects.

  1. Run the EV3 code first.
  2. Start the AI2 app.
  3. Press “Connect EV3”. This will give the list of BT devices known to the Android device. Choose the correct EV3.
  4. Once the App is connected to the EV3, the EV3 will say “detected”.
  5. Pressing the top, middle, or bottom buttons on the EV3 will send a BT message to the App:
    • A string for the top button
    • A number for the middle button
    • A boolean for the bottom button.
  6. The App will then display the message name and contents in the top two boxes.
  7. You can send string, number or boolean values back to the EV3 via the relevant boxes and buttons.

Updated BT comms between EV3 and AI2

Recently I posted about sending Bluetooth messages from an EV3 to an AI2 app. I decided at the time not to bother handling the receipt of IEEE754 floats, as strings would work. Since then I have been thinking about how cluttered the code was, and that AI2 doesn’t easily support libraries of AI2 code. So, I started investigating writing a simple EV3Mailbox extension for AI2. After a bit of learning Java (I’m a Perl programmer at heart) I now have an extension:

EV3Mailbox Extension

The extension is still quite simple, solely handling the packing and unpacking of the message bytes. The Bluetooth comms part still needs to be performed by AI2 code. An example of using this to send/receive messages is below:

Sample send/recv AI2 code.

Downloads

As usual, I am making my code available for use under the Creative Commons Attribution-ShareAlike 4.0 International License

The sample .aia file should include the extension .aix file, but I am also making that specifically available for use in other people’s projects.

  1. Run the EV3 code first.
  2. Start the AI2 app.
  3. Long-press the BT button. This will bring up a settings box.
  4. Enter the App device’s Bluetooth name in the settings and press ‘save’. The name will be stored and sent to the EV3, so that the EV3 knows where to send its messages. It must be the same name as shown in the BT connections list on the EV3.
  5. Press “Connect EV3”. This will give the list of BT devices known to the Android device. Choose the correct EV3.
  6. Once the App is connected to the EV3, the EV3 will say “detected”.
  7. Pressing the top, middle, or bottom buttons on the EV3 will send a BT message to the App:
    • A string for the top button
    • A number for the middle button
    • A boolean for the bottom button.
  8. The App will then display the message name and contents in the top two boxes.
  9. You can send string, number or boolean values back to the EV3 via the relevant boxes and buttons.

Receiving BT Mailboxes from EV3 by an AI2 App

Recently, on the MINDSTORMS Facebook group, the question was posed as to whether it is possible for an AI2 app to receive Bluetooth mailbox messages from an EV3. This is something I’d been meaning to do for a while: I’d written AI2 code to send BT messages to an EV3, but hadn’t focused on receiving messages. This was the spur to actually get that code written.

It wasn’t too tricky. Receiving the message was simple; parsing it was the harder part. The format of a message from the EV3, as covered in my update to BT messaging, is:

MLenL, MLenH, 0x01, 0x00, 0x81, 0x9E, NLen, NameBytes, 0x00, TLenL, TLenH, TextBytes, 0x00

This is received as a string of bytes, so it has to be parsed as a list. Added to that, there doesn’t appear to be a chr(x)-type function in AI2 to convert a number to its equivalent ASCII character, so I have to do some array/list lookups. Thankfully I already had that code in place for sending to the EV3.
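For reference, the same packing and unpacking is easy to express in ordinary Python; the AI2 blocks have to do the equivalent with list operations. This sketch simply follows the byte layout above and isn’t part of the downloadable code:

```python
import struct

def pack_mailbox(name, text):
    """Build a mailbox message: MLenL, MLenH, 0x01, 0x00, 0x81, 0x9E,
    NLen, NameBytes, 0x00, TLenL, TLenH, TextBytes, 0x00."""
    name_b = name.encode('ascii') + b'\x00'
    text_b = text.encode('ascii') + b'\x00'
    body = (b'\x01\x00\x81\x9e'
            + bytes([len(name_b)]) + name_b
            + struct.pack('<H', len(text_b)) + text_b)
    return struct.pack('<H', len(body)) + body

def unpack_mailbox(msg):
    """Return (mailbox name, text payload) from a received message."""
    name_len = msg[6]                                # includes the trailing 0x00
    name = msg[7:7 + name_len - 1].decode('ascii')
    text_start = 7 + name_len + 2                    # skip TLenL and TLenH
    text = msg[text_start:-1].decode('ascii')        # drop the trailing 0x00
    return name, text

print(unpack_mailbox(pack_mailbox('abc', 'hello')))  # ('abc', 'hello')
```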

Code

I’ve got the code to a position where it can hopefully be used for other purposes, so I’m releasing what I’ve done so far as a baseline for others to work from. To make the code, linked below, work, do the following:

  1. Run the EV3 code first.
  2. Start the AI2 app.
  3. Long-press the BT button. This will bring up a settings box.
  4. Enter the App device’s Bluetooth name in the settings. The name will be stored and sent to the EV3, so that the EV3 knows where to send its messages. It must be the same name as shown in the BT connections list on the EV3.
  5. Press “Connect EV3”. This will give the list of BT devices known to the Android device. Choose the correct EV3.
  6. Once the App is connected to the EV3, the EV3 will say “detected”.
  7. Pressing the top, middle, or bottom buttons on the EV3 will send a BT message to the App.
  8. The App will then display the message name in the top text box, and the message text contents in the bottom box.

The code only handles text messages. I have no plans to develop code to handle numbers or booleans: the EV3 code will coerce those data types to strings if they are sent as a text message, and AI2 will coerce strings to numbers if they look correct. So if the EV3 needs to send a number, simply send it as a string and the AI2 App will still do the Right Thing™.

Images and Code

AI2 Code Image

The AI2 code and EV3 code may be obtained from:

The code is released under the Creative Commons Attribution-ShareAlike 4.0 International License

Producing Complex Structures on the EV3DPrinter

So, for the past few weeks I’ve been working on my build of the EV3DPrinter. I’ve had it printing various geometric shapes, “College” letters, and more recently I’ve been working on a complex shape – namely a castle:

A Small Castle

Development

Since I’ve been programming my instance of the EV3DPrinter in EV3G, I couldn’t realistically use G-code – the string handling in EV3G isn’t up to the task – so I needed to feed it plot data some other way.

For years I’ve been using a very old X11 drawing package, Tgif; this has the great advantage that its file format is text-based and object-oriented – a perfect source of drawing data. I used this package to draw out my letters:

Letters in Tgif

This worked perfectly, as it was easy to convert the polygon path of each letter into a simple set of coordinates for use in my code on the printer. I was, however, accidentally fortunate in that I’d aligned the letters to a grid, so I could work out which letter was which based on its row and then its position along that row.

When it came to making the castle I figured I’d use the same program to develop each layer, and just write a new Perl program to parse the file into plot data. It was rather more complex than I’d thought. I had to design each and every layer of PLA, bar those layers that repeated, i.e. the first 5. So, first, I started off with just the base of the windows:

Base of the castle’s windows

This took two goes as the first time I had the slope at the bottom at too shallow an angle, so they sagged badly. Next I worked my way up the walls:

Working out the walls

Eventually, after some trial and error, I got to the final castle.

Process – First Version

The luck I’d had with the letters wouldn’t work for a complex object. I needed to be able to define a layer’s polygon, its position relative to the layer below, and some form of ordering. The simple answer, for me, in Tgif was to have a bounding box, a text object with a number, and a polygon inside the bounding box, as below:

Layer 27 of the castle

Layer 27, above, closes up the tops of all the windows and the doorway, along with producing buttresses for the crenellations at the front. So here there is the box, the number “27” and a polygon – the just-visible arrow on the inside bottom left defines the end point of the path for that layer. The line width of the polygon defines how many layers are to be repeated for this path – 1 in this case.

This was a time-consuming process, which did work, but it resulted in 42 layers. The final Tgif image looked as below:

Final Tgif file for the castle

Process – Second Version

Although the Tgif images had so far worked well, producing them was a lot of effort. Discussing it with a colleague, he asked why I couldn’t define a start and an end and then go layer-by-layer from one to the other. A bit of thinking and a new plan came to mind.

The same system of box, text, and polygon would be used, but with a way of linking a start and an end: a simple dotted box would group two layers together. The layer numbers would define the start and end, and all layers between would be interpolated between the two polygons. Of course both polygons would need the same number of points, but that’s fine.
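The interpolation itself is straightforward. My implementation is in Perl, reading the Tgif file directly, but the core idea is just a linear blend between corresponding points, roughly as in this Python sketch:

```python
def interpolate_layers(start_poly, end_poly, start_layer, end_layer):
    """Produce one polygon per layer from start_layer to end_layer by linearly
    interpolating between two polygons with the same number of points."""
    assert len(start_poly) == len(end_poly)
    steps = end_layer - start_layer
    layers = []
    for n in range(steps + 1):
        t = n / steps
        layers.append([((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
                       for (x0, y0), (x1, y1) in zip(start_poly, end_poly)])
    return layers

# Example: a square shrinking towards a smaller square between layers 22 and 27.
outer = [(0, 0), (40, 0), (40, 40), (0, 40)]
inner = [(10, 10), (30, 10), (30, 30), (10, 30)]
for poly in interpolate_layers(outer, inner, 22, 27):
    print(poly)
```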

A little new coding, along with spotting a mistake in my first castle, and I got the new Tgif image below:

13 layers to define the castle

Close up on layers 22 – 27

This is so much easier to manage. I’m now thinking about what to make next. I’ll be writing another program that will be able to read in multiple complex models and give a menu to select what to make. Once that’s done, I think I’ll be ready to publish my code – watch this space.

New Plott3r Pen Holder

Whilst preparing for Bricktastic I developed a new, and hopefully better, pen holder for my Plott3r. This holder is a little heavier than the original, and somewhat more solid, which ought to remove some of the small erratic movements from the plots. The PDF for the building instructions can be found at:

Plott3r-PenHolder-v2

The LDR file, should you wish to load it into an LDraw package, is below:

Plott3r Pen Holder LDR file

It should be noted that the pen holder needs to be built onto the tracked section; it cannot be clipped on like the original. Build the base, attach it to the tracks, then clip the front and back sections onto it.