EE5111 Selected Topics in Industrial Control & Instrumentation

Assessment:

Implement a simple IoT pipeline with AWS Cloud platform and Visualise the data.

Author:

Zhan Yuancheng (A0206839W)

Github:

https://github.com/nkzyc/EE5111_A0206839W

Medium post:

https://www.cnblogs.com/nkzyc/p/11561289.html

1. Introduction
1.1 IoT Background
1.2 Goals of the Assessment
1.3 Prerequisites
2. Sending data to AWS by the official tutorial
2.1 Create the AWS IoT Policy
2.2 Create the things
2.3 Email Alerts for Low Water Content Readings
2.4 Simulate random water content
2.5 Results
3. Sending data FD001 to AWS
3.1 Introduction of the database
3.2 Create the AWS IoT Policy for FD001
3.3 Create the things of FD001
3.4 Create AWS IoT rule and AWS DynamoDB
3.5 Use Python code to meet the requirements
3.6 Results and Analysis
4. Sending data from FD001 and FD002 to AWS
4.1 Create the Policy, Thing and DynamoDB for FD002
4.2 Create IoT rule for both FD001 and FD002
4.3 Results and Analysis
5. Visualization of data
5.1 Get AWS access key
5.2 Redash environment
5.3 Obtain and Visualize the data
5.4 Examples of Visualization by Redash
6. Analysis of Motor Vehicle Inspection data source
6.1 Introduction of data source
6.2 Create the Policy, Thing, Rule and DynamoDB
6.3 Data processing and Results
6.4 Visualization of data source
7. Conclusion
8. Appendix (complete Python code)

1. Introduction

1.1 IoT Background

IoT

The Internet of Things (IoT) is a system of interrelated computing devices, mechanical and digital machines, objects, animals or people that are provided with unique identifiers (UIDs) and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction.

The definition of the Internet of things has evolved due to the convergence of multiple technologies, real-time analytics, machine learning, commodity sensors, and embedded systems. Traditional fields of embedded systems, wireless sensor networks, control systems, automation (including home and building automation), and others all contribute to enabling the Internet of Things. In the consumer market, IoT technology is most synonymous with products pertaining to the concept of the "smart home", covering devices and appliances (such as lighting fixtures, thermostats, home security systems and cameras, and other home appliances) that support one or more common ecosystems, and can be controlled via devices associated with that ecosystem, such as smartphones and smart speakers.

There are a number of serious concerns about the dangers of IoT growth, especially in the areas of privacy and security; consequently, industry and governments have begun to address these concerns.

AWS IoT

AWS IoT provides secure, bi-directional communication between Internet-connected devices such as sensors, actuators, embedded micro-controllers, or smart appliances and the AWS Cloud. This enables you to collect telemetry data from multiple devices, and store and analyze the data. You can also create applications that enable your users to control these devices from their phones or tablets.

AWS IoT consists of the following components:

Device gateway

Message broker

Rules engine

Security and Identity service

Registry

Group registry

Device shadow

Device Shadow service

Device Provisioning service

Custom Authentication service

Jobs service

AWS IoT integrates directly with the following AWS services:

Amazon Simple Storage Service

Amazon DynamoDB

AWS Lambda

Amazon Simple Notification Service

Amazon Simple Queue Service

And so on

1.2 Goals of the Assessment

The full description of the goals is on GitHub:

https://github.com/iceberg12/NUS_guest_lecture/tree/master/EE5111

In this report, I achieve the following goals:

1. Follow the AWS instructions to understand how to populate and send data from a computer to AWS through the MQTT protocol. (Optional)

2. Publish pre-defined engine data to AWS.

3. Simulate two IoT things.

4. Visualize the data for the two engines for all the sensors.

5. Use other data sources.

Each of these goals is covered in detail in the following sections.

1.3 Prerequisites

The tools and environment configuration I used in this task are listed below. Please refer to the corresponding official documentation for the specific installation steps.

Environment:

Windows 8.1

Python 3.5.4

AWS IOT core

Tools:

Pycharm 2017

Anaconda 3

Datadog

Redash

2. Sending data to AWS by the official tutorial

The official documentation includes a Plant Watering with AWS IoT tutorial, which helps us understand how to send data to AWS from our computer.

This Plant Watering example shows how to use a moisture sensor and AWS IoT to monitor the soil moisture level of a house plant or garden. We create a rule in AWS IoT that sends an email to an address subscribed to an Amazon SNS topic when the moisture level falls below a threshold.

There are five steps, as follows.

2.1 Create the AWS IoT Policy

In this step, we create an AWS IoT policy that allows our sensor to connect and send messages to AWS IoT. We should do the following:

Sign in to the AWS Management Console (https://aws.amazon.com) and open the AWS IoT console; enter IoT Core and select Policies, as shown in Figure2.1

Figure2.1 create policy1

Select Create Policy and provide the name PlantWateringPolicy; enter iot:* for the action and * for the ARN, as shown in Figure2.2;

Figure2.2 create policy2
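The resulting policy document should look roughly like the following JSON (a sketch only; iot:* on all resources is far too permissive for production use, but acceptable for this exercise):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "iot:*",
      "Resource": "*"
    }
  ]
}
```

In a real deployment, the action list and resource ARNs should be narrowed to the specific things and topics the device needs.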

2.2 Create the things

In this step, we create a thing in AWS IoT, which is used as a device simulator. Devices connected to AWS IoT are represented by things in the AWS IoT registry.

Select Manage and choose Things and use Create; choose to create a single thing as shown in Figure2.3.

Figure2.3

Give it the name MyRPi and select Next; on the Add a Certificate page, choose Create Certificate as shown in Figure2.4

Figure2.4

Download the certificate files for the thing; select Activate; for the policy to attach to the thing, select PlantWateringPolicy. The thing has now been created, as shown in Figure2.5.

Figure2.5

2.3 Email Alerts for Low Water Content Readings

In this step, we set up Amazon Simple Notification Service (Amazon SNS) to automatically send an email alert to the plant's owner when the soil moisture content is too low, prompting them to water it.

Create an AWS IoT rule to trigger email alerts through Amazon SNS. Select Actions and select Create a rule; enter the name MyRPiLowMoistureAlertRule for this rule; for the rule query statement, enter the following AWS IoT SQL statement:

SELECT * FROM '$aws/things/MyRPi/shadow/update/accepted' WHERE state.reported.moisture = 'low'

Under Set one or more actions, select Add Action and choose to send the message as an SNS push notification, as shown in Figure2.6

Figure2.6

Set up Amazon SNS to send messages to my email inbox via an Amazon SNS topic; for the topic ARN, select the topic's ARN; for the protocol, select Email and enter my email address.

All are as shown in Figure2.7

Figure2.7

2.4 Simulate random water content

In this step, we push simulated moisture readings to the thing's shadow in AWS IoT. When a reading is too low, Amazon SNS automatically sends an email alert to the plant's owner.

Create a new file on the development machine using the code (plantwater.py) from the GitHub repository, as shown in Figure2.8

Figure2.8

Run the code with PyCharm.
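plantwater.py itself is in the repository; the following is only a rough sketch of what such a simulator does (the threshold, level names, and certificate file names here are assumptions, not the repository code):

```python
import json
import random

def moisture_payload():
    """Build one simulated shadow-update message for the MyRPi thing."""
    reading = random.uniform(0.0, 1.0)          # simulated moisture sensor
    level = 'low' if reading < 0.3 else 'okay'  # 0.3 threshold is an assumption
    return json.dumps({'state': {'reported': {'moisture': level,
                                              'value': round(reading, 3)}}})

# Publishing requires the endpoint and certificates from section 2.2, so the
# connection is only sketched (names follow the AWSIoTPythonSDK shadow client):
# from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTShadowClient
# client = AWSIoTMQTTShadowClient('MyRPi')
# client.configureEndpoint('<your-endpoint>.iot.<region>.amazonaws.com', 8883)
# client.configureCredentials('root-CA.crt', 'private.pem.key', 'cert.pem.crt')
# client.connect()
# shadow = client.createShadowHandlerWithName('MyRPi', True)
# shadow.shadowUpdate(moisture_payload(), None, 5)
```

Whenever a payload reports moisture as 'low', the rule from 2.3 fires and SNS sends the email alert.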

2.5 Results

My NUS email received the alerts from AWS, as shown in Figure2.9, which indicates that our first goal of sending data from a computer to AWS through the MQTT protocol was achieved.

Figure2.9

3. Sending data FD001 to AWS

3.1 Introduction of the database

The data provided comes from a high-fidelity system-level engine simulation designed to simulate nominal and faulty engine degradation over a series of flights. The flights are full flight recordings sampled at 1 Hz and consist of 30 engine and flight-condition parameters. Each flight contains 7 unique flight conditions over an approximately 90-minute flight, including ascent to cruise at 35,000 ft and descent back to sea level. The structure of the engine's sensors is shown in Figure3.1.

Figure3.1

FD001 and FD002 are input data for machine fleets. Sensor names can be found in the CMAPSS file. The column names of the text input files are: id, cycle, os1, os2, os3, sensor1, ..., sensor22.

3.2 Create the AWS IoT Policy for FD001

This step is similar to 2.1 Create the AWS IoT Policy; I will only describe the differences.

Sign in to the AWS Management(https://aws.amazon.com) and open the AWS IoT console; Enter the IoT Core and select the policies.

Select Create Policy and provide the name Mypo1_A0206839W; enter iot:* for the action and * for the ARN, as shown in Figure3.2;

Figure3.2

3.3 Create the things of FD001

This step is similar to 2.2 Create the things; I will only describe the differences.

Select Manage and choose Things and use Create; choose to create a single thing;

Give the name Mything1_A0206839W and select Next; On the Add a Certificate page, choose Create Certificate as shown in Figure3.3

Figure3.3

Download the thing's certificate, public key, private key, and the AWS IoT root CA.

Select Activate; for the policy to attach to the thing, select Mypo1_A0206839W, as shown in Figure3.4.

 

Figure3.4

Then Mything1_A0206839W has been created, as shown in Figure3.5.

Figure3.5

3.4 Create AWS IoT rule and AWS DynamoDB

In this step, we create a rule that collects information from the MQTT messages and sends it to a DynamoDB table, where we can analyze it.

Select Actions and select Create a rule; enter the name Role1 for this rule; for the rule query statement, enter the following AWS IoT SQL statement, as shown in Figure3.5:

SELECT state.reported.* FROM '$aws/things/Mything1_A0206839W/shadow/update/accepted'

Figure3.5

Under Set one or more actions, select Add Action and choose the action that inserts a message into a DynamoDB table, as shown in Figure3.6

Figure3.6

On the Configure action page, choose Create a new resource and go to the Amazon DynamoDB page, choose Create table as shown in Figure3.7.

Figure3.7

On the Create DynamoDB Table page, enter a name A0206839W. Choose 'id' as the partition key and 'timestamp' as the sort key as shown in Figure3.8

Figure3.8

Choose Create to create the table A0206839W, as shown in Figure3.9

Figure3.9

Back on the Configure action page, under Create a new role, enter the name role_A0206839W and choose Create, as shown in Figure3.10

Figure3.10

Select Create Rule to create the rule Role1, as shown in Figure3.11.

Figure3.11

3.5 Use Python code to meet the requirements

3.5.1 Requirement

The requirements of Publish pre-defined engine data to AWS (Step 2) are as follows:

a. Download data for engines FD001 and FD002 from GitHub.

b. Modify the Jupyter notebook in Step 1 to read and publish data from train_FD001.txt to your thing under the AWS IoT platform at a rate of 10 seconds per row. Overwrite column 'id' of the engine as 'FD001' + id; add one more column 'timestamp' as a timestamp in UTC; also add one more column that contains your matric number.

c. Set up an Act under the AWS IoT platform to define a rule that splits the message into columns of a DynamoDB table.

d. Check under the AWS DynamoDB service that your table is populated correctly.


3.5.2 Methods

I designed methods to satisfy each requirement step by step, as described below.

The complete code is in the appendix (main1.py).

For (a): I downloaded the data from GitHub.

For (b): First, I wrote code to meet each specific requirement:

Read and publish data from train_FD001.txt to the AWS IoT platform at a rate of 10 seconds per row:

time.sleep(10)

Overwrite column 'id' of the engine as 'FD001_' + id:

content.append(str('"' + 'FD001_' + x[0] + '",'))

Add one more column 'timestamp' as a timestamp in UTC:

tim = str(datetime.datetime.utcnow())  # tim = timestamp

Add one more column that contains the matric number:

maNum = 'A0206839W'

Second, I designed two functions to build the message in the required form:

# build the title (column names) of the message
def form_title():

# convert the data into JSON form for sending
def form_joson(content, x, title, tim, maNum):

Third, the codes of reading file FD001.txt and sending it to DynamoDB is as follow:

fp = open('train_FD002.txt', 'r')
#title which satisfy
the request by teacher
title=form_title ();
for reader in fp.readlines():
    x = reader.split(" ")
    #time
   
tim = str(datetime.datetime.utcnow());
    #the content of the message
   
content = [];
    #the joson form of data to send
   
send=form_joson(content, x, title, tim, maNum)
    print(send)
    #send to the dynamnoDB
   
myDeviceShadow.shadowUpdate(send, myShadowUpdateCallback,
5)

time.sleep(10)
fp.close();
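Since form_title and form_joson are only declared above, here is a minimal sketch of what they might look like (the exact column list and JSON layout are assumptions based on the requirements in 3.5.1, not the appendix code):

```python
import datetime
import json

maNum = 'A0206839W'

def form_title():
    # column names: id, timestamp, matric number, then the FD001 fields
    return (['id', 'timestamp', 'MatricNumber', 'cycle', 'os1', 'os2', 'os3']
            + ['sensor' + str(i) for i in range(1, 23)])

def form_joson(content, x, title, tim, maNum):
    # content is unused in this sketch; kept only to match the signature above
    values = ['FD001_' + x[0], tim, maNum] + x[1:]
    pairs = ', '.join('"%s": "%s"' % (k, str(v).strip())
                      for k, v in zip(title, values))
    # shadowUpdate expects the fields under state.reported
    return '{"state": {"reported": {%s}}}' % pairs

# One short illustrative row (real FD001 rows carry all 21+ sensor values)
row = '1 1 -0.0007 -0.0004 100.0'.split(' ')
msg = form_joson([], row, form_title(),
                 str(datetime.datetime.utcnow()), maNum)
parsed = json.loads(msg)
```

The resulting document keeps one key per column, which is what lets the rule in 3.4 split the message into DynamoDB columns.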

For (c): this was completed in 3.4 Create AWS IoT rule and AWS DynamoDB.

For (d): the outcome is shown in 3.6 Results and Analysis.

3.6 Results and Analysis

Results

The results are shown in Figure3.12.

Figure3.12

Analysis

From the results, we can conclude the following:

1. The DynamoDB table A0206839W received the data sent by Mything1_A0206839W successfully, which indicates that the whole AWS IoT setup is correct.

2. Inspecting one row: the 'id' is FD001_1, the 'timestamp' is 2019-09-20 11:38:57.733647, and each 'sensorx' value matches the FD001.txt data, which indicates that the Python code satisfies the requirements.

4. Sending data from FD001 and FD002 to AWS

We are going to simulate two IoT things, so we need a second thing. As the whole process for the second thing is similar to 3. Sending data FD001 to AWS, this report omits the duplicated steps.

4.1 Create the Policy, Thing and DynamoDB for FD002

4.1.1 Policy

Select Create Policy and provide the name Mypo2_A0206839W; enter iot:* for the action and * for the ARN. The policy is shown in Figure4.1

Figure4.1

4.1.2 Thing

Select Manage, choose Things, and use Create; choose to create a single thing.

Give it the name Mything2_A0206839W; Mything2_A0206839W has been created as shown in Figure4.2.

Figure4.2.

4.1.3 DynamoDB

As we need the two "things" to run in parallel, the second thing uses the same DynamoDB table (A0206839W) as the first.

4.2 Create IoT rule for both FD001 and FD002

We need to modify the rule Role1 to meet the requirements of Step 3. Specifically, we change the rule query statement from

SELECT state.reported.* FROM '$aws/things/Mything1_A0206839W/shadow/update/accepted'

to

SELECT state.reported.* FROM '$aws/things/+/shadow/update/accepted'

Here '+' is the MQTT single-level topic wildcard, so the rule matches the shadow updates of both Mything1_A0206839W and Mything2_A0206839W and forwards them to the same DynamoDB table, allowing the two "things" to run in parallel.

The updated Role1 is shown in Figure4.3.

Figure4.3

4.3 Results and Analysis

The completed codes of Mything2_A0206839W are in the appendix.

I run the two scripts (main1.py and main2.py in the appendix) at the same time, to simulate the two "things" publishing data in parallel.

Results

The results in DynamoDB are shown in Figure4.4.

Figure4.4

Analysis

From the results, we can conclude the following:

1. The DynamoDB table A0206839W received the data sent by Mything2_A0206839W successfully, which indicates that the whole setup for Mything2_A0206839W is correct.

2. Studying the second screenshot in Figure4.4, the DynamoDB table has received both FD001 and FD002 data and stored them as separate rows. This indicates that my method satisfies the requirements of Step 3.

5. Visualization of data

5.1 Get AWS access key

Go to the AWS account and click the user zyc18 under Users, as shown in Figure5.1

Figure5.1

Use Add permissions and select the AmazonDynamoDB access policy, as shown in Figure5.2

Figure5.2

Download the access keys, as shown in Figure5.3

Figure5.3

We can then use these keys to read data from the DynamoDB table.

5.2 Redash environment

Introduction

Redash is an open-source tool that aims to democratize data and make data-driven decision making easy.

We can query Amazon DynamoDB with a familiar syntax, enjoy live auto-complete, and explore the DynamoDB schema easily in Redash's cloud-based query editor. We can also visualize DynamoDB data, gather it into thematic dashboards from multiple sources, and share the story the data tells with our team or external partners.

Configuration

On the Redash home page (https://app.redash.io), open the New Data Source page, choose Create a New Data Source, and select DynamoDB. Enter the Name, Access Key, and Secret Key of our AWS DynamoDB, as shown in Figure5.4.

Figure5.4

5.3 Obtain and Visualize the data

Obtain

Open the Queries page and use a SQL-like query to fetch data from the DynamoDB table, as shown in Figure5.5

Figure5.5

Now we can obtain data from the DynamoDB table A0206839W.
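A minimal query of this kind might look as follows (Redash's DynamoDB data source uses DQL, a SQL-like query language, rather than full SQL; the table name is ours and the LIMIT is illustrative):

```sql
SELECT * FROM A0206839W LIMIT 100
```

Without a WHERE clause on the partition key, such a query scans the table, which is fine for a table of this size.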

Visualize

Open the New Visualization to visualize the data as shown in Figure5.6

Figure5.6

We can use this tool to visualize whichever data we want to study.

5.4 Examples of Visualization by Redash

Using this visualization, I chose several representative examples to show relationships in the data.

Line chart

I take time as the X column and os1, os2, os3 as the Y columns; the outcome is shown in Figure5.7

Figure5.7

Bar chart

I take time as the X column and sensor1, sensor2, sensor3 as the Y columns; the outcome is shown in Figure5.8

Figure5.8

Scatter chart

I take time as the X column and sensor16, sensor17, sensor18 as the Y columns; the outcome is shown in Figure5.9

Boxplot

The outcome for all the columns is shown in Figure5.10

6. Analysis of Motor Vehicle Inspection data source

6.1 Introduction of data source

The data source is the annual motor vehicle inspection dataset (passing rate of motor vehicles on first inspection) from https://data.gov.sg.

The columns of the data source are shown in Figure5.1

Figure5.1

Some samples of the data are shown in Figure5.2

Figure5.2

The goal is to collect this data from specific monitors, store it in AWS DynamoDB, and analyze it through visualization.

6.2 Create the Policy, Thing, Rule and DynamoDB

As the whole process is similar to the earlier steps, I will omit the duplicated steps and point out the differences.

6.2.1 Policy

Select Create Policy and provide the name Mypo2_A0206839W; enter iot:* for the action and * for the ARN. The policy is shown in Figure6.1

Figure6.1

6.2.2 Thing

Select Manage, choose Things, and use Create; choose to create a single thing.

Give it the name Mypas_rate_A0206839W; the thing has been created as shown in Figure6.2.

Figure6.2

6.2.3 DynamoDB

On the Create DynamoDB Table page, enter the name pass_rate. Choose 'year' as the partition key and 'timestamp' as the sort key. The DynamoDB table is shown in Figure6.3

Figure6.3

6.2.4 Rule

Select Actions and select Create a rule; enter the name Rule_pasrate for this rule.

Under Set one or more actions, select Add Action and choose the DynamoDB table.

The rule has been created as shown in Figure6.4

Figure6.4

6.3 Data processing and Results

We need to convert the .csv data into .json messages. In addition, to present the data more clearly, we add one more column 'timestamp' as a timestamp and one more column that contains the matric number.

The complete code is in the appendix (main3.py).

Methods

Convert the .csv data into .json messages:

with open("pas_rate.csv", "r", encoding="utf-8") as fp:
    # reader is an iterator over the rows of the CSV file
    reader = csv.reader(fp)
    # call next() once to skip the header row
    next(reader)
    for x in reader:
        content = []
        tim = str(datetime.datetime.utcnow())
        content.append(str(title[0] + ':'))
        content.append(str('"' + x[0] + '",'))
        for i in range(1, 8):
            content.append(str(title[i] + ':'))
            if i == 1:
                content.append(str('"' + tim + '",'))
            elif i == 2:
                content.append(str('"' + maNum + '",'))
            else:
                content.append(str('"' + x[i - 2] + '",'))
        cc = ''.join(content)[:-1]

Add one more column 'timestamp' as timestamp

tim = str(datetime.datetime.utcnow());#tim= timestamp

Add one more column that contains Matric number

maNum = 'A0206839W'
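Assuming the column layout above, the same conversion can be sketched more compactly with csv.DictReader (a simplified illustration; the header line and sample row here are made up for the demo, and the real script builds the JSON by hand as shown above):

```python
import csv
import datetime
import io
import json

maNum = 'A0206839W'

def csv_to_messages(text):
    """Convert each CSV row into the JSON message stored in pass_rate."""
    messages = []
    for row in csv.DictReader(io.StringIO(text)):
        row['timestamp'] = str(datetime.datetime.utcnow())  # UTC timestamp
        row['MatricNumber'] = maNum                         # matric number
        messages.append(json.dumps({'state': {'reported': row}}))
    return messages

# Illustrative two-line sample; the real input is pas_rate.csv from data.gov.sg
sample = ("year,type,age,number_reported,number_passed,passing_rate\n"
          "2018,Taxi,1,100,99,0.99\n")
first = json.loads(csv_to_messages(sample)[0])
print(first['state']['reported']['year'])  # prints 2018
```

DictReader maps each row to column names automatically, so adding the two extra columns is just two dictionary assignments.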

Results

The results in the DynamoDB table pass_rate are shown in Figure6.5

Figure6.5

Analysis

From the results, we can conclude the following:

1. The DynamoDB table pass_rate received the data sent by Mypas_rate_A0206839W successfully, which indicates that the whole setup for Mypas_rate_A0206839W is correct.

2. The pass_rate table has the columns 'year', 'timestamp', 'MatricNumber', 'type', 'age', 'number_reported', 'number_passed', 'passing_rate', with correct data.

This indicates that my method satisfies the requirements of Step 5.

6.4 Visualization of data source

Using the same methods as in 5. Visualization of data, we can study the annual motor vehicle inspection data with Redash and visualize it.

6.4.1 Obtain the data

Open the Queries page and query data from the DynamoDB table pass_rate, as shown in Figure6.6

Figure6.6

6.4.2 Visualize and Analyze the data

Open New Visualization to visualize the data. We can use this to draw conclusions that may help us evaluate vehicles.

I take type as the X column and passing_rate as the Y column; the outcome is shown in Figure6.7.

From the figure we can conclude that Other Vehicles have the lowest pass rate and Taxis have the highest.

Figure6.7

I take year as the X column and passing_rate as the Y column; the outcome is shown in Figure6.8.

From the figure we can conclude that the pass rate improves over time, which indicates that vehicles are becoming more and more reliable.

Figure6.8

I take age as the X column and number_reported and number_passed as the Y columns; the outcome is shown in Figure6.9.

From the figure we can conclude that most people have their vehicles inspected within 3 years.

7. Conclusion

I completed the following goals of the assessment:

1. Understand how to populate and send data from a computer to AWS through the MQTT protocol.

2. Publish pre-defined engine data to AWS.

3. Simulate two IoT things.

4. Visualize the data.

5. Use different data sources to analyze.

In addition, I gained the following benefits:

1. Mastered the skill of quickly extracting knowledge from official documentation.

2. Learned more Python and Java coding techniques.

3. Learned many new AWS features in preparation for future use.

4. Gained a new understanding of the entire control system.

Finally, I thank Dr. Nguyen Hoang Tuan Minh for his knowledge and guidance.
