Everybody knows the story of Little Bo-Peep, who lost her sheep. It is heartbreaking! I am terrified to imagine my three-year-old in that situation. Julia’s tears would be endless … I would be quite shaken up by the whole thing too … But that is my daughter and me, a pair of softies. My neighbor Michael, a farmer up the road, would not be crying if a few of his cows went astray. He would pull himself together, mutter a few choice words in the direction of whoever left the gate open, and start looking for the cows. And he would find them! It might take a bit of time, but I would not doubt Michael for a second. Still, is there a way for IoT to help? I am sure Michael has a million and one things to do other than looking for animals that should not have been lost in the first place. Luckily, I have been working on localization of farm-IoT devices and will tell you all about it. We even published an article on this, cited below, which has all of the details.
When it comes to localization of animals, the farmer can choose from a number of options. There are GPS-based solutions (mainly animal collars) that can tell almost exactly where the animals are. Those collars would be ideal if only GPS were not such an energy-hungry technology. If you use a collar like that, you’ll either have to change a lot of batteries or rely on some sort of remote charging technology (e.g. solar, which is also not ideal in cloudy and rainy Ireland). As an alternative, you can go for an anchor-point system: a bunch of devices (i.e. infrastructure) is installed on your farm and communicates with an animal collar, which is then located via a sort of advanced triangulation technique. That solves the energy problem, yet you have to install and maintain those devices on your farm, a clear minus of the technology. With the two previous technologies out, we are left with one more option: Activity Sequence-Based Map Matching, or ASMM for short. In my eyes, this is a clear winner.
Figure 1: Context awareness for cow-localization
Activity Sequence-Based Map Matching is a bit of a mouthful as a name, though the technique itself is not that difficult. Think of a typical day of a cow on a dairy farm. It is quite regimented (see Fig. 1). The cow either eats/sleeps/digests in a field (paddock is the formal term), goes to or comes from milking, or is being milked in the parlor. So, realistically, most of the time the cow is either in a field or in the milking parlor. And not just any field: clever drafting gates and other farm tech make sure that the cows go through the fields in a certain order, so the grass has time to grow before it’s eaten again. So if we know what the cow has been up to so far and which fields (and in what order) the cow was scheduled to visit today, surely we’ll be able to tell where the cow is now.
Say today the cow can go to paddocks 1, 2, and 3. It’s evening, the cow has been milked twice, and it is headed to a field for the night. So soon enough the cow will be in paddock 3: ASMM in action. We know the activities of the cow (e.g. in paddock, in transit, being milked) and we match those to the map of the cow’s planned movements for the day. In the end we know the location. Simple, you say! Well, not really: knowing the animal’s activity (i.e. Context Awareness) is quite tricky from a technological point of view.
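To make the paddock-3 example above concrete, here is a minimal sketch of the matching step. The activity labels, the `asmm_locate` helper, and the counting rule are my own illustrative simplifications, not the exact representation from the paper: it simply counts completed "in paddock" episodes in the detected activity sequence and looks up the corresponding entry in the day's planned paddock order.

```python
def asmm_locate(observed_activities, paddock_schedule):
    """Infer the cow's current (or next) paddock from its activity sequence.

    Counts "paddock" visits: each transition from another activity
    into "paddock" starts a new visit, which is then matched against
    the planned paddock order for the day.
    """
    visits = 0
    prev = None
    for activity in observed_activities:
        if activity == "paddock" and prev != "paddock":
            visits += 1
        prev = activity
    if not visits:
        return None  # the cow has not reached a paddock yet
    # Match the visit count against today's planned paddock order.
    index = min(visits, len(paddock_schedule)) - 1
    return paddock_schedule[index]


# Today's plan: paddocks 1, 2, 3, with milkings in between.
schedule = [1, 2, 3]
activities = ["paddock", "transit", "milking", "transit",
              "paddock", "transit", "milking", "transit", "paddock"]
print(asmm_locate(activities, schedule))  # third visit -> paddock 3
```

The heavy lifting in the real system is, of course, producing the activity sequence itself, which is exactly the context-awareness problem discussed next.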
Figure 2: Gaussian Mixture Effect and its impact on (a) Signal Distribution (b) Windowed Mean (c) Windowed Variance
Being in a field, being milked in the parlor, and transitioning between the two are high-level activities, whose inertia measurements come from mixed Gaussian distributions (see Fig. 2). These distributions (especially if unbalanced) are noticeably harder to separate using Windowed Mean and Variance, the two metrics commonly used by other authors, who often focus on low-level activities (e.g. standing, moving, turning, etc.) whose inertia measurements are Gaussian. We think we built a solution for that too. We show that, using Windowed Max and Min, an unbalanced mixed-Gaussian signal can be separated with sufficient accuracy. So the high-level activities can be accurately detected, and Michael does not need long to find his cows!
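A quick simulation can illustrate the intuition. The distribution parameters, mixture weight, and window size below are made up for demonstration, not taken from the paper: in an unbalanced mixture, the windowed mean hugs the dominant component, while the windowed max reliably picks up the minority component's contribution.

```python
import random

random.seed(42)


def windowed(signal, window, fn):
    """Apply fn over consecutive non-overlapping windows of the signal."""
    return [fn(signal[i:i + window])
            for i in range(0, len(signal) - window + 1, window)]


# Unbalanced two-component mixture: 90% from N(5, 1), 10% from N(10, 1),
# versus a pure-Gaussian baseline from N(5, 1) alone.
mixture = [random.gauss(5, 1) if random.random() < 0.9 else random.gauss(10, 1)
           for _ in range(1000)]
baseline = [random.gauss(5, 1) for _ in range(1000)]

window = 50
mean_mix = windowed(mixture, window, lambda w: sum(w) / len(w))
max_mix = windowed(mixture, window, max)
max_base = windowed(baseline, window, max)

# Windowed means of the mixture stay near the dominant component (~5),
# so they barely distinguish it from the baseline. The windowed max of
# the mixture, however, sits well above the baseline's windowed max.
print("avg windowed mean (mixture):", round(sum(mean_mix) / len(mean_mix), 2))
print("avg windowed max  (mixture):", round(sum(max_mix) / len(max_mix), 2))
print("avg windowed max  (baseline):", round(sum(max_base) / len(max_base), 2))
```

Windowed Min plays the symmetric role when the minority component sits below the dominant one.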
We got your back, Michael! Well… only if you use our animal collars…
Publication Title: Leveraging Fog Analytics for Context-Aware Sensing in Cooperative Wireless Sensor Networks
Authors: Kriti Bhargava, Stepan Ivanov, Diarmuid McSweeney, William Donnelly.
Journal: ACM Transactions on Sensor Networks (TOSN), vol. 15, no. 2, 2019.