The first day of the New Year features a full wolf supermoon, when the Moon is closest to the Earth and so looks brighter and bigger. The Moon was lovely, so I figured I’d try for a shot. The Independent tells you why it’s a Wolf Moon.
The Moon’s disc itself is as bright as a beach on a summer’s day when you’re taking a picture of it – it’s in full sunlight, with no clouds, and at about the same distance from the Sun as the Earth. Should be a doddle – I took the Canon EF 100-400 lens that I’d cleaned up, put it on a monopod and aimed at the Moon. f/8, 1/400s, ISO 200, go.
Turns out not to be as easy as that. I needed a tripod, switched off IS, and even then not every shot was equally sharp – I must find the remote release cable for the Canon; maybe it’s mirror slap. I took the best one, that’s the top picture. I then tried my Micro Four Thirds camera with a 100-300 lens – the MFT sensor is smaller than the APS-C sensor in my EOS 450D, so the 300mm end is roughly comparable with the 400mm end on the Canon.
Wind is normally the enemy of sound recordists, but going through some recordings from last year I found this recording of ex-hurricane Ophelia from the 16th October 2017. Ophelia had been pretty nasty originally and was still bad when it got to Ireland.
I recorded it in Glastonbury in the south-west, by finding a sheltered spot and pointing the mic in a windshield at a bunch of trees, which made a good recording given the wind. The key was that I had good shelter at the mic, but the trees were exposed to the full force of the wind.
The storm dragged up a load of Saharan dust, making the sky the sickly yellow in the pic.
I went to an open day in October run by the mind people, makers of the Vilistus EEG interface. It was an opportunity to see it in action and ask questions – the day was £85, which wasn’t too bad, and there were about five other people there. It was run in an anonymous hotel near a football ground in Birmingham just off the M6, and led by Stephen Clark, who knew the product well.
It was an interesting day. The Vilistus 4 box is a digitising interface, but the analogue signal conditioning is done in the sensor boxes, which add some cost to the overall system. Their default software looks fine even for Mind Mirror work, since it seems to have the filter bank in it; the extra cost of the Mind Mirror package probably covers extra training. You seem to get the Vilistus Pro software with the box. I haven’t seen any of the units come up on eBay.
I learned that the interface between the Vilistus box and the computer is OpenEEG P3, which was good to know. Stephen did warn that a lot of the older code from the OpenEEG project assumed there were only 6 active slots, rather than following the protocol specification, which allows the source to say whether there are 6 or 8 slots of data. Vilistus use 8 slots, so code assuming 6 would barf.
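The 6-versus-8 slot gotcha is easy to sketch. The framing below follows the widely published modularEEG P2 packet layout (sync bytes, version, counter, two bytes per channel, switches byte), with the channel count made a parameter – the exact P3/Vilistus framing may differ in detail, so treat this as an illustration rather than a reference implementation:

```python
# Sketch of P2-style OpenEEG framing with a parameterised slot count.
# The layout below is the documented modularEEG P2 shape; P3/Vilistus
# specifics are assumptions here.

SYNC = b"\xa5\x5a"

def parse_packet(packet: bytes, n_channels: int = 8):
    """Return (counter, [channel values]) from one raw packet."""
    # sync(2) + version(1) + counter(1) + 2 bytes/channel + switches(1)
    expected = 2 + 1 + 1 + 2 * n_channels + 1
    if len(packet) != expected or packet[:2] != SYNC:
        raise ValueError("bad frame")
    counter = packet[3]
    values = []
    for ch in range(n_channels):
        hi = packet[4 + 2 * ch]
        lo = packet[5 + 2 * ch]
        values.append((hi << 8) | lo)   # 10-bit sample, big-endian
    return counter, values

# A synthetic 8-slot packet: code that hard-wires n_channels=6 would
# misparse this stream, as the extra 4 bytes shift every later sync search.
demo = (SYNC + bytes([2, 42])
        + b"".join(v.to_bytes(2, "big") for v in range(0, 800, 100))
        + b"\x01")
counter, vals = parse_packet(demo, n_channels=8)
```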
He did say the existing API would allow the Vilistus Pro software to continually dump the values of the filter slots to a text file that could be read by a program to display the output on LEDs – obviously I would get to build the interface and write the program 😉
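If that text-file route ever happens, the reading side is trivial. The file name and the comma-separated line format below are pure assumptions on my part, and `drive_leds` is a hypothetical output routine – the Vilistus documentation would be the arbiter:

```python
import time

def parse_slot_line(line: str):
    """One line of the (assumed) dump: comma-separated filter-slot values."""
    return [float(f) for f in line.split(",") if f.strip()]

def follow(path, poll=0.1):
    """Yield new lines as the Vilistus software appends them to the file."""
    with open(path) as fh:
        fh.seek(0, 2)                  # start at the current end of file
        while True:
            line = fh.readline()
            if line:
                yield parse_slot_line(line)
            else:
                time.sleep(poll)

# for values in follow("vilistus_slots.txt"):   # file name is an assumption
#     drive_leds(values)                        # hypothetical LED routine
```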
The Vilistus Pro software did show correlations well – most clearly on a display where they showed heart rate against a trigger for breathing in and out. The heart rate slows a teeny bit on breathing out relative to breathing in, although this effect fades with age – it was clear on the 25-year-old student and not really visible on a 50-something lady on the course. EEG was tough to get going in the course, although it was demonstrated using disposable electrodes on the forehead. This isn’t the optimal placement for Mind Mirror, but you can’t use disposable electrodes on areas of the scalp covered by hair.
The trouble is this rig would be about £1200 all in, and I’m not yet sure I am £1200 interested in the Mind Mirror. I did get a much better feel for using this in the field, and I’m aware that while I have been able to solve the digitising side of things using the PIC, I still need to solve the EEG diff amp, and solve the electrode problem.
Vilistus seem to have solved a lot of that, but even the electrode set is ~£200, so the bundle would be the way to go. One to mull over, really – to work out whether I want the functionality or the engineering challenge. I could probably knock off £500 going DIY if the development went OK, but experience shows that one or two PCB fails or wrong turns can wipe out the savings on a one-off project where there’s a COTS solution.
We are using 160901 to make compost tea. Although the temperature has fallen to ambient, it’s still a bit early – this compost is only seven weeks old. While all the green and soft plant material has gone and is no longer recognisable for what it was, the woodchip takes longer to break down. As such the tea will be mainly bacterial; the fungi, which are better at decomposing woody material, take longer to develop. But sometimes it is not worth letting the perfect be the enemy of the good.
Looking back at the success we had with the beans, the compost organisms took a few months after going out to show their full effect – the results after only a couple of months were more modest – so we want to get this out now to do its work over the winter.
The Fonnereau Way has been used since the mid-1800s, although it has been the subject of fights: an incoming resident at the Westerfield end tried on several occasions to block it up and have it stopped up. Network Rail has also had it in for the pedestrian level crossing, but has likewise failed to have it struck off.
Becoming a housing estate will clearly change this part of the Fonnereau Way, so I walked it to capture some pictures and soon-to-be-historical sounds from the route. The farmland is intensively farmed and heavily sprayed, as I’ve observed a few times; it’s quite possible that being turned into a housing estate may actually increase the biodiversity. Although the birds will be persecuted by hundreds of domestic cats and the gardens will no doubt be tiny, the farmland doesn’t support that many birds at the moment.
The Fonnereau Way starts from Christchurch Park, but I started where the changes will be made, where it crosses Valley Road. In the local plan all vehicle access will be from Henley Road rather than Valley Road.
and it’s a noisy place. It gets better quickly as the old path threads its way past some sports facilities and the playing fields
before reaching farmland
There are a few birds in the farmland, but to be honest the urban Brunswick Road Rec has more diversity to my ears – here the birds are few and far between
I started redecorating the lab, so the EEG project is now relegated to an Autumn/winter project 😉 Which is a shame as I’d got close to replicating the Mind Mirror system in Open EEG and getting a hardware gizmo set up using a PIC. The best laid plans of mice and men…
It’s basically a single channel digital oscilloscope, but it works with Picotech’s Picoscope software, which has all sorts of features that are new to me, like software RS232 decoding, click to set trigger levels, and long persistence simulation.
I have a decent Tek 2245A analogue scope, which computes frequency and voltage levels from cursors on the traces,
This is now very old, dating from 1989. It does most of what I want/need, and for most of my design career I worked with analogue ’scopes, with the logic analyser as a separate piece of gear. However, despite its measly 100kHz bandwidth the Pico did show me some of the attraction of a more modern approach. Every so often I’ve toyed with the idea of getting a Chinese scope, something like the Rigol 2000 series or similar. So far I haven’t cracked. There’s a lot to be said for a standalone scope, but I wonder if the combination of my regular analogue bench scope and a Pico might be even better.
but it would be a terrible thing to give this to a beginner. I could only make it trigger properly because I’ve used analogue scopes for years and have some feel for what should happen – all too often on the FPGA scope, if the vertical trigger wasn’t in range you simply didn’t get to see anything useful at all, so you couldn’t tell which way to shift the trigger point. And the user interface is revolting – too much clickety-click of two separate left-centre-push-right buttons for my liking.
Picoscope is far better thought out, although it still doesn’t give as much control of input sensitivity and offset as a regular bench scope. But it, and the associated DC-coupled arbitrary waveform generator, will be a great tool for testing the OpenEEG filters at sub-audio frequencies. And unlike the typical fly-by-night USB scopes, the software supports legacy models back to when Pico started – because that is of course always the problem with any hardware that depends on a piece of software running on some other device: it easily becomes orphaned before its service life is over. See pretty much any hardware made by Apple that is more than three or four years old 😉
The DrDAQ does pretty much all that I want for the EEG work, but the AWG doesn’t support frequency sweep mode, which is a shame – I’d need to go for something like the 2206B at £250 to get that. So I’ll probably do it the old way: set the AWG to output a single frequency and step through the frequency range. What isn’t clear is the frequency resolution of the AWG.
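Stepping through the band one frequency at a time is easy enough to automate later. Here’s a sketch of the idea in Python, using a one-pole RC low-pass as a stand-in for the filter under test – the real measurement would drive the AWG and read the scope instead of calling a function:

```python
import numpy as np

def one_pole_lowpass(x, fc, fs):
    """Stand-in for the device under test: a single-pole RC low-pass."""
    a = 1.0 - np.exp(-2.0 * np.pi * fc / fs)
    y = np.zeros_like(x)
    acc = 0.0
    for n, xn in enumerate(x):
        acc += a * (xn - acc)      # y[n] = a*x[n] + (1-a)*y[n-1]
        y[n] = acc
    return y

def response_at(freq, fs=1000.0, fc=50.0, seconds=1.0):
    """Drive the filter with one tone; report output/input RMS ratio."""
    t = np.arange(int(fs * seconds)) / fs
    x = np.sin(2 * np.pi * freq * t)
    y = one_pole_lowpass(x, fc, fs)
    half = len(y) // 2             # discard the settling transient
    return np.sqrt(np.mean(y[half:] ** 2)) / np.sqrt(np.mean(x[half:] ** 2))

# Step through the band, the manual-AWG way:
for f in [5, 20, 50, 100, 300]:
    print(f"{f:>4} Hz: {20 * np.log10(response_at(f)):6.1f} dB")
```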
The Meare at Thorpeness is only three feet deep and even a light breeze seems to rock these boats making a lot of noise.
A nice place in the summer – not so rammed with people as nearby Aldeburgh can be, and the boating lake is fun. Easy reach of the beach, too. The lake gets a good view of the whimsical House in the Clouds water tower
The Peter Pan-themed lake and the House in the Clouds are the creation of Scottish barrister Glencairn Stuart Ogilvie at the start of the 1900s
Now I have convinced myself that I can get a version of the OpenEEG hardware to run into EEGmir, I want to see if I can reproduce one of the Cade-Blundell filters. I have an analogue simulation from earlier, and I want to see if I can reproduce this in EEGmir. The filter specification protocol in EEGmir is the same as in fiview from Jim Peters’ site1, and since fiview displays the transfer function it looks like a good place to start.
a tale of linux graphical display woe…
The Windows version doesn’t run, beats me why. So I try it on Linux. My most powerful Linux computer is an Intel NUC, but because Debian is hair-shirt purist and therefore snippy about NDAs and proprietary drivers, I think it doesn’t like the graphics drivers – it was tough enough to get the network port working, and X server and VNC are deeply borked on it. If something is stuffed on Linux then it’s reload-from-CD and start again, because I haven’t got enough life left to trawl through fifty pages of line noise telling me what went wrong. So I’m stuck with the command line there. I try fiview on the Pi instead, and this fellow sorts me out with TightVNC on the Pi, which is a relief – trying to get a remote graphical display on a Linux box seems to be an endless world of hurt, and I only have a baseband video monitor on the Pi console.
Simulating the 9Hz Blundell filter
I already have SDL 1.2 on the Pi, so it goes. Let me try the 9Hz channel, which had the highest Q of the Cade-Blundell filters. If you munge the order and bandwidth specs you get fc = 9Hz, BW = 1.5Hz.
Converting that to Fiview-speak that is
fiview 256 -i BpBe2/8.22-9.72
which in plain English means: simulate, at a sampling rate of 256Hz, a bandpass Bessel 2nd-order IIR between 8.22 and 9.72Hz. So let’s hit it.
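As a cross-check, the same specification can be reproduced in scipy if you have that to hand. I’m assuming scipy’s prototype order and band-edge conventions line up with fiview’s – plausible, but not verified against fiview’s source:

```python
import numpy as np
from scipy import signal

fs = 256.0
# BpBe2/8.22-9.72 in fiview-speak: 2nd-order Bessel bandpass, 8.22-9.72 Hz.
# scipy's N counts the prototype low-pass order; I assume fiview's does too.
b, a = signal.bessel(2, [8.22, 9.72], btype="bandpass", fs=fs)

# Evaluate the transfer function and find the peak of the passband
w, h = signal.freqz(b, a, worN=8192, fs=fs)
peak = w[np.argmax(np.abs(h))]
print(f"peak response at {peak:.2f} Hz")
```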
Unfortunately the amplitude axis is linear, which is bizarre. Maybe, mindful of their 10-bit (1024-level) resolution, OpenEEG didn’t want to see the horror of the truncation noise and hash. I can go to Tony Fisher’s site (he wrote the base routines Jim Peters used in fiview) and have another bash
Running the analogue filter with the same linear frequency display I get
which shows the same response2. H/T to the bilinear transformation for that. I had reasonable confidence this would work – I did once cudgel my brain through this mapping of the imaginary axis of the s-plane onto the unit circle when I did my MSc. Thirty summers have left their mark on the textbook and faded the exact details in my memory 😉 But I retained enough to know I’d get a win here.
It’s not strictly exactly the same, because of the increasing effect of the frequency warping of the bilinear transformation as the frequency approaches fs/2. But in practice, given the fractional bandwidth of the filters, the warping only gives the upper stopband a subtly different shape in the tails – I struggle to see it here. ↩
Now that I can get signals into the OpenEEG modP2 format, the next stage is to qualify the filtering used within eegmir and to put an antialiasing filter in front of the ADC. The sampling rate is only 256Hz, so the highest representable frequency is 128Hz. Anything above that will alias down – particularly frequencies within ±50Hz of 256Hz, which alias down to 0-50Hz and corrupt my area of interest. That includes the fourth, fifth and sixth harmonics of the 50Hz power frequency and the second harmonic of the 100Hz full-wave rectifier ripple tossed onto the powerline by every switched-mode power supply in the neighbourhood.
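The aliasing arithmetic is easy to demonstrate numerically – here the 5th harmonic of the mains folds straight down into the EEG band:

```python
import numpy as np

fs = 256                       # EEG sampling rate
t = np.arange(fs) / fs         # one second of samples, so 1 Hz-wide FFT bins

# The 5th harmonic of 50 Hz mains sits at 250 Hz; sampled at 256 Hz it
# lands on |250 - 256| = 6 Hz, right in the middle of the EEG band.
mains_harmonic = np.sin(2 * np.pi * 250 * t)
spectrum = np.abs(np.fft.rfft(mains_harmonic))
print("aliased to", np.argmax(spectrum), "Hz")   # → aliased to 6 Hz
```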
OpenEEG are good enough to put their schematic up on the Web, so I simulated their antialiasing filter.
Hmm, colour me underwhelmed. At 10-bit resolution the steps are 1/1024, so the quantisation noise floor is 20×log10(1/1024), or about -60dBFS. So you’d like to be 60dB down by fs/2 at 128Hz, which is where I’ve drawn the line. We are at, …drum roll…, -16dB by then. At least the crap there gets aliased to the high frequencies, but by fs we are still only at -26dB. Nice try, but no cigar. I guess that’s the price I pay for saving myself the grunt of lining up all those analogue filters. TANSTAAFL, and I get to try harder here. At least there are only two of these filters.
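For the record, the arithmetic behind that -60dB target:

```python
import math

bits = 10
step = 1 / 2 ** bits                      # LSB size with full scale = 1
floor_db = 20 * math.log10(step)          # quantisation floor in dBFS
print(f"{floor_db:.1f} dBFS")             # so only -16 dB of filtering
                                          # leaves aliases ~44 dB above it
```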
Elliptic filter design
The obvious way here would be an elliptic filter, targeting a notch at fs/2 and another at fs. I had thought there would be an online calculator by now, but perhaps nobody makes analogue filters any more1. So it’s back to the Williams book. It’s all about the ratio between stopband and passband frequencies. The stopband is non-negotiable at fs/2 – say 120Hz, so hopefully a notch will be dropping just beyond that onto 128Hz. I have flexibility on the passband: the Mind Mirror only goes up to 38Hz, so if I choose a passband cutoff of 60Hz I get a steepness of 2. I’m easily prepared to take a passband ripple of 0.3dB (ρ = 25%)2, so I am after a C ⟨order⟩ 25 ⟨Θ⟩ filter.
From Table 2-2 I want Θ = 30° for my steepness of 2, so I want a C ⟨order⟩ 25 30 filter, with only the order to determine. I’d really like that to be 3 rather than 5 😉 Sadly, looking up C 03 25 28, the stopband is only 30dB. Shifting to Θ = 20° would give me a steepness of 3 and a stop of 40dB, but my passband would come down to 40Hz
A C 05 25 32 would give me a stop of 60dB. I will give some of that up in component tolerances, but it’s better than 16dB and gives me some chance of fighting all that mains rubbish, so let’s take a look.
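A rough cross-check of that table entry is possible in scipy. This designs a 5th-order elliptic low-pass from the ripple and stopband numbers rather than from the Williams tables, so the transition frequencies won’t match exactly, but the stopband depth around 120Hz should:

```python
import numpy as np
from scipy import signal

# 5th-order elliptic low-pass: 0.3 dB passband ripple (roughly rho = 25%),
# 60 dB stopband, passband edge 60 Hz, as an analogue prototype.
b, a = signal.ellip(5, 0.3, 60, 2 * np.pi * 60, btype="low", analog=True)

# Spot-check the passband at 30 Hz and the stopband around fs/2 and fs
f = np.array([30.0, 120.0, 128.0])                  # Hz
w, h = signal.freqs(b, a, worN=2 * np.pi * f)
for fi, hi in zip(f, h):
    print(f"{fi:5.0f} Hz: {20 * np.log10(abs(hi)):7.1f} dB")
```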
It’s not bad. I’d probably want to shift the corner frequency down by 5Hz. It’s good that it isn’t anywhere near as sensitive to component values as the Cade-Blundell bandpass filters were – the shifts due to preferred values were significant there, but here the traces are close. For comparison the original OpenEEG line is in blue. The filter is complex, but not terrible; I can take solace that this is the quid pro quo for not having to line up all those 54 filter centre frequencies 😉
So far I have inched my way towards a Mind-Mirror-compatible EEG in a theoretical way, but to make it work in real life I need a way of getting signals into the machine. You can buy a board made by Olimex for a reasonable £50 – you get optoisolation and everything, and it’s probably the most cost-effective way. Trouble is, I don’t know that EEGmir works yet, so I want to do it cheaper, and also now. A Microchip PIC16F88 will do the job here, and I have a few 🙂
I tinkered with this SPBRG calculator to find a suitable crystal to run the PIC16F88 at, to match both the 256Hz sampling rate and the baud rate. The first run of EEGmir showed me nothing at all.
Inquiring further, it seems the Raspberry Pi gets shirty about a 3% baud-rate error at 57600 baud. I set up a test PIC to pump out an endless string of As, and when I brought up minicom they showed up as Ps. This is not good.
I needed to go and find a 3.6864MHz crystal, which gets you down to 0% error at 57.6k, and by a fortunate stroke of luck Fosc/4 divides down integer-wise to 256Hz. Nice. So I did that, sending a bunch of As in the data frames to the Pi, after padding down the 5V TTL signal from the PIC.
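The sums, using the baud-rate formula from the PIC16F88 datasheet (asynchronous mode):

```python
def uart_error(fosc_hz: float, baud: int, brgh: bool = True):
    """PIC16F88 async baud rate: baud = Fosc/(16*(SPBRG+1)) with BRGH=1,
    or Fosc/(64*(SPBRG+1)) with BRGH=0. Returns (SPBRG, actual baud, % error)."""
    div = 16 if brgh else 64
    spbrg = round(fosc_hz / (div * baud)) - 1
    actual = fosc_hz / (div * (spbrg + 1))
    return spbrg, actual, 100.0 * (actual - baud) / baud

# The 3.6864 MHz crystal: SPBRG = 3, exactly 57600 baud, 0% error
print(uart_error(3_686_400, 57600))

# And the instruction clock divides cleanly for the sampling rate:
print(3_686_400 // 4 / 256)        # 3600 instruction cycles per 256 Hz frame
```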
Minicom showed the As OK from the test PIC, but it wouldn’t let go of the TTY until I rebooted. EEGmir comes up and shows me a load of gobby stuff about data errors. Pressing F12 shows it is assessing jitter
and telling me I have a sampling rate of 325Hz. The nice thing about hardware is that you can get a second opinion – sometimes it’s the smoke pouring out of something, but here it’s the frame rate of the signal, as I’d given myself a sync pulse on a spare PIC pin to synchronise my scope to. So I appeal the outrageous assertion that I am running too fast
and get handed down the verdict of guilty as charged, I did screw up. And I didn’t wait for the camera to focus.
Let’s look on the bright side. This PIC is sending out data at the right baud rate, and sort of the right number of frames – just too damn fast. And EEGmir is reading from the Pi serial port and struggling manfully to make some sense of it. The (256Hz) on the jitter display even gives me hope it might adapt if I choose to run at 128Hz. Oh, and I find that the Escape key is the quit command in EEGmir, which saves having to go and find the PID and do a kill -9 on it, which always feels a bit bush league.
The sampling rate error was because I failed to wait for TMR1 to time out, which I was using to define the frame rate; fixing that sorted the sampling rate – it’s now 256.04Hz according to EEGmir. It’s still hollering about data errors, so I have probably misunderstood the OpenEEG2 protocol somehow.