SEA Pied Piper personal projects - II

This article describes the finalist projects of the SEA Pied Piper competition. For a detailed look at the rest of the projects, see the previous post.

Lawrence set himself a difficult goal: to find alien life in space. For that purpose he built a probe with 5 IoT elements that transmits data back to Mother Earth but at the same time is able to react locally to certain thresholds, following the principles of IQT. Not only did he make full use of Redis and ECS, but he also extracted data from an external API and even used GCP's Text-to-Speech service to generate spoken alerts at the pretend Earth Control Centre. Lawrence also stole his children's Lego to build a great-looking casing for his project. He ran out of time to add some solar panels for deep-space trips ... but apart from that his was a well-rounded project that earned him third position with the same points as Chris.
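
A minimal sketch of that "react locally, transmit anyway" behaviour might look like the following. The sensor names, thresholds and message shape are illustrative only, not Lawrence's actual code:

```python
import json

# Hypothetical thresholds for the probe's sensors (illustrative values)
THRESHOLDS = {"radiation": 50.0, "temperature": 80.0}

def handle_reading(sensor, value):
    """Decide what the probe does with one reading.

    Every reading is relayed back to Earth; when a threshold is
    crossed the probe also reacts locally rather than waiting for
    a round trip to the control centre.
    """
    message = json.dumps({"sensor": sensor, "value": value})
    if value > THRESHOLDS.get(sensor, float("inf")):
        return {"transmit": message, "local_action": f"alarm:{sensor}"}
    return {"transmit": message, "local_action": None}
```

In the real probe the `transmit` payload would be published with an MQTT client such as paho-mqtt, while `local_action` would drive a buzzer or LED on the device itself.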


Chris focused heavily on the IoT side of things. He managed to make use of 10 sensors spread across two Raspberry Pis, including an LCD display with an I2C interface, an accelerometer and an 8 x 8 LED matrix. His project was one of the few that experimented with graphing the data stored in the Redis database. He also managed to cover the basics of object storage with ECS. This is what his IoT setup looks like.
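
As a sketch of the graphing idea, readings pulled from Redis need to be turned into plottable series first. The `"timestamp:value"` storage format below is an assumption for illustration, not Chris's actual schema:

```python
def parse_series(raw_entries):
    """Split raw 'timestamp:value' strings (an assumed storage
    format) into parallel lists ready for a plotting library."""
    timestamps, values = [], []
    for entry in raw_entries:
        ts, _, val = entry.partition(":")
        timestamps.append(int(ts))
        values.append(float(val))
    return timestamps, values

# In the real setup the entries would come from Redis, e.g.
# r.lrange("sensor:temperature", 0, -1) with the redis-py client,
# and the two lists would feed matplotlib or a charting web page.
```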


Thang's project, "Pigate", featured the most comprehensive use of serverless of any project. He successfully implemented a prototype of a smart gate driven by a servo that was able to detect number plates using AWS Rekognition and the Lambda ecosystem. His project had a sophisticated MQTT workflow that even made the messages flow back to the edge.
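
A sketch of the cloud side of that workflow could filter Rekognition's text detections for something plate-shaped. The response shape follows Rekognition's `detect_text` API; the plate pattern and confidence cut-off are illustrative assumptions, not Thang's actual rules:

```python
import re

# Illustrative plate pattern (e.g. "SGX1234A"); the real pattern
# depends on which plates Pigate needs to recognise.
PLATE_RE = re.compile(r"^[A-Z]{3}\d{1,4}[A-Z]$")

def extract_plate(rekognition_response, min_confidence=90.0):
    """Return the first LINE detection that looks like a number
    plate from an AWS Rekognition detect_text response."""
    for det in rekognition_response.get("TextDetections", []):
        if det.get("Type") != "LINE":
            continue
        text = det.get("DetectedText", "").replace(" ", "")
        if det.get("Confidence", 0) >= min_confidence and PLATE_RE.match(text):
            return text
    return None
```

In the full pipeline the response would come from `boto3.client("rekognition").detect_text(...)` inside a Lambda, and a recognised plate would trigger the MQTT message back to the edge that opens the gate.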


It is a very strong project from an IoT perspective and the only one to use a servo. You can see a demo of Pigate below.
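
For context on the servo part: hobby servos are typically driven by a 50 Hz PWM signal where a 1 ms pulse means 0 degrees and a 2 ms pulse means 180 degrees. A sketch of that conversion (the pulse widths are the usual defaults, not necessarily Pigate's exact calibration):

```python
def angle_to_duty(angle, freq_hz=50, min_ms=1.0, max_ms=2.0):
    """Convert a servo angle (0-180 degrees) into a PWM duty
    cycle percentage for a typical hobby servo."""
    if not 0 <= angle <= 180:
        raise ValueError("angle must be between 0 and 180")
    period_ms = 1000.0 / freq_hz          # 20 ms at 50 Hz
    pulse_ms = min_ms + (max_ms - min_ms) * angle / 180.0
    return 100.0 * pulse_ms / period_ms   # duty cycle in percent

# On the Pi itself the result would feed something like
# RPi.GPIO's pwm.ChangeDutyCycle(angle_to_duty(90)) to swing the gate.
```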


Thang and LiJwee got the same points and finished the competition in second position. Not happy with just one, LiJwee presented two big projects so that he could demonstrate more skills. The first was a fully functional "Piper" radio that, by leveraging every conceivable Python library, was capable of tuning multiple radio stations, regulating the volume, sending tweets with your favourite songs and generating QR codes in ECS with links to YouTube channels. An I2C screen provides visual feedback locally, and an impeccable-looking web app in Cloud Foundry shows which songs have been tweeted and allows you to add new radio stations.


For his radio, LiJwee delivered undoubtedly the best-finished product of any project, with not a single cable in sight.


LiJwee's second project, "Boost Finder", is an interface into the Elysium database that holds Data Domain's autosupports. It retrieves Elysium data in JSON format and stores the processed file in MongoDB. This allows him to retrieve information that helps him in his day-to-day job as a talented DPS Systems Engineer. This is another great example of what Pied Piper can achieve by empowering our SEs to create tools that integrate with our products or systems, either to improve a business process or to help our customers derive more value from our products.
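
The retrieve-process-store pattern might be sketched like this. The autosupport field names here are hypothetical, since the post doesn't detail the Elysium schema:

```python
import json

def to_document(raw_json):
    """Reduce a raw autosupport record (hypothetical field names)
    to the flat document that gets stored in MongoDB."""
    record = json.loads(raw_json)
    system = record.get("system", {})
    return {
        "serial": system.get("serial_number"),
        "model": system.get("model"),
        "alert_count": len(record.get("alerts", [])),
    }

# With pymongo the processed record would then be stored with
# collection.insert_one(to_document(raw_json)).
```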

Now, let's look at the first position, where we also had a draw, this time between TanLong and Raghava. The amount of effort behind these two projects is simply phenomenal. Both of them applied great skill and sacrificed enormous amounts of personal time over 6 weeks to produce results that are off the charts.

TanLong set out to build a Smart Greenhouse. His prototype collected data on environment variables that are relevant to plants, such as soil moisture, temperature and brightness, and shipped it via MQTT to a Big Data workflow. This workflow uses Apache NiFi and a Hadoop database underpinned by an Isilon with an HDFS interface; the data is analysed with Apache Zeppelin and graphed in real time with Highcharts. TanLong's project is the one with the largest number of integrations with external services.
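
The edge end of such a pipeline boils down to publishing one MQTT message per reading for NiFi to pick up. A sketch of what the Pi might emit — the topic scheme and field names are assumptions, not TanLong's actual ones:

```python
import json
import time

def greenhouse_message(sensor, value, unit, plant_bed="bed-1"):
    """Build the MQTT topic and JSON payload for one greenhouse
    reading. Topic scheme and field names are illustrative."""
    topic = f"greenhouse/{plant_bed}/{sensor}"
    payload = json.dumps({
        "sensor": sensor,
        "value": value,
        "unit": unit,
        "ts": int(time.time()),
    })
    return topic, payload

# A paho-mqtt client would then call client.publish(topic, payload),
# and NiFi's ConsumeMQTT processor would route it into HDFS.
```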

In an effort to demonstrate even more skills, the greenhouse features a sophisticated security system which uses the full serverless ecosystem from AWS, SNS email alerts included. Not happy with using just AWS Rekognition, he also got Azure's Cognitive Services to provide a second opinion on the images being analysed. As you can see, the fan and the light in the greenhouse prototype can be actioned remotely from the web app running in Cloud Foundry. AWS Polly was shouting alert messages loud and clear.
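
One way to use that second opinion is to alert only when both services agree. A minimal sketch — label names, confidence scale and threshold are illustrative, not TanLong's actual logic:

```python
def intruder_confirmed(aws_labels, azure_tags, threshold=0.7):
    """Return True only when both AWS Rekognition and Azure
    Cognitive Services report a person with enough confidence.
    Inputs are {label: confidence 0-1} dicts; the label names
    and threshold are illustrative."""
    return (aws_labels.get("Person", 0.0) >= threshold
            and azure_tags.get("person", 0.0) >= threshold)
```

Requiring agreement between two independent models is a simple way to cut false alarms at the cost of occasionally missing a detection.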


TanLong did not just use an external REST API to augment the data in his solution; he also built his own. Most of his solution lived in the Singapore Solutions Centre (and more than likely will continue to live there), so he built a REST API to make it easier to bridge to the public cloud.
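
A minimal, standard-library-only sketch of that kind of bridge: a read-only JSON endpoint that exposes lab data to the public cloud. The route and the readings are made up for illustration:

```python
import json
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

# Illustrative in-lab data the API exposes to the public cloud
READINGS = {"temperature": 24.6, "humidity": 61.0}

class BridgeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/readings":
            body = json.dumps(READINGS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):  # keep the sketch quiet
        pass

# In the lab this would run for real with something like
# ThreadingHTTPServer(("0.0.0.0", 8080), BridgeHandler).serve_forever()
```

A real deployment would of course sit behind authentication; the point is only that a small REST layer lets cloud-side components poll lab data without a direct network bridge.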

The other winner is Raghava. He is the only member of the team who managed to tick each and every one of the skills in the evaluation matrix. When you are trying to score so many points it is very hard to come up with a cohesive story that blends all the different technologies. Raghava told the story of a large multinational that operates in many verticals and is looking for a large prototype to showcase our ability to help with its transformation.

To start with, the IoT part of his solution had 11 elements connected to the same Raspberry Pi, which for some inexplicable reason decided not to catch fire. All those sensors were ingeniously arranged in a plastic container with holes drilled in it. The lid of the container could be closed in place, which made the solution surprisingly portable.

The data then flowed to Redis and MongoDB in PWS as well as to ECS and to Splunk. Raghava used two different brokers: RabbitMQ in PWS and Mosquitto inside a VM in the Azure cloud. Raghava used Vagrant to deploy his solution, and some components were installed with Ansible, which gives him the privilege of being the only one who used that skill from the course.

The main page of the web app was also very nicely done with Bootstrap CSS and includes a photo carousel.

Another section of the app displays a world map that shows the status of all the sites the prototype is supposed to monitor. This map gets updated via JavaScript.


Finally, Raghava also used a comprehensive workflow based on Lambda, S3 and Rekognition to analyse the photos taken by the webcam. The solution included AWS Polly text-to-speech to generate warning messages.
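
A skeleton of that Lambda, with the pure-Python part (unpacking the S3 upload event) runnable as-is and the AWS calls indicated in comments since they need credentials. The bucket and key names in the event are whatever the webcam upload produces, not anything specific to Raghava's setup:

```python
def parse_s3_event(event):
    """Extract (bucket, key) pairs from the S3 event that fires
    when the webcam uploads a photo."""
    return [
        (rec["s3"]["bucket"]["name"], rec["s3"]["object"]["key"])
        for rec in event.get("Records", [])
    ]

def handler(event, context=None):
    """Lambda entry point (sketch). For each uploaded photo the
    real function would call rekognition.detect_labels on the S3
    object and, on a hit, polly.synthesize_speech for the spoken
    warning; here we only return what would be analysed."""
    return parse_s3_event(event)
```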

Thanks to the SEA Team for all the great effort and commitment to the program, and congratulations to all the winners.
