Project tutorial
Remote and Voice Controlled Smart Fan

I use a fan for white noise while sleeping, but sometimes forget to turn it on before getting into bed. This will fix that.

Components and supplies

Small oscillating tabletop fan
×1
Relay module
×1
Raspberry Pi 3 Model B
×1
Speaker with aux connection
×1
USB microphone
×1
Button
I'm using a standard arcade button, but use whatever works for you
×1
Arduino UNO & Genuino UNO
×1
Adafruit NeoPixel Ring: WS2812 5050 RGB LED
×1

Necessary tools and machines

Wire stripper
Screwdriver

About this project

Like many people, I sleep with white noise in the background generated by a fan in the bedroom. Often enough I'll forget to turn the fan on until I have already gotten into bed, at which point it quickly becomes a back and forth of "do I really want to get out of bed to turn that on?". Because of this, I have decided to update a fan using Android Things so that it can be turned on through an app, as I always have a phone near the bed plugged into the wall at night. In addition, this tutorial covers adding the Google Assistant to the fan, adding Google Actions in order to operate the fan through voice commands, and adding the Awareness API from Google Play Services for automating operations.

There are seven parts to this project: the IoT fan, the mobile app for controlling the fan, the Google Assistant API, Google Actions, a Firebase backend for managing state data and interacting with Actions, Alexa support and an Arduino for controlling components that Android Things can't support. While the phone app could communicate directly with the fan through Bluetooth, I have opted to use Firebase in order to support scaling the project to other platforms later on, such as a web platform or iOS, and to incorporate other Google services. Another option that could be used in the future is Google's Cloud IoT Core, although, at the time of this writing, it is still in a private beta.

To start, this tutorial will set up the fan to turn on and off with a mobile app. While that's a fun start, we're going to aim for awesome. The next step will be to incorporate Google Assistant into the device, as I currently have a Google Home in the kitchen, but not one in the bedroom, and this seems like a great project for incorporating that technology. After that we will add Actions to the project in order to control the device through the Google Assistant. With that said, let's get started making the initial project!

STEP 1: Fan on/off via app

Setting Up Firebase

Before you can start using Firebase in your applications, you will need to create a Firebase project through Google. You can start by going to Google's Firebase console site and clicking on the blue Create New Project button.

If this is not your first project, you can click on the Add a Project button.

After you hit the blue Create Project button, Firebase will generate your project and take you to the Firebase console screen.

On the main Firebase console screen, you should see three options for adding Firebase to a project. For this project you will want to click on the green Add Firebase to your Android App button.

After you have clicked on the above button, a new dialog will open that will ask you for the necessary information for getting Firebase into your Android application. You will need to enter the package name for your app. Remember the package name that you use, as you will need it when you create your Android Things and mobile apps.

After you select REGISTER APP, you will be prompted to download a file named google-services.json. Go ahead and download that now.

You will want to save this file for when you have created your Android app.

After closing the popup window, you will want to go into the database section of the Firebase console.

Although you can get more complicated with Firebase, for this demo we will assume only one fan for now. The database will have a smartfan root object with key/value boolean pairs representing the on/off state of the fan and if auto-on is enabled.
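The code later in this tutorial reads these values from the database root reference, so a minimal sketch of the structure looks like the following (the key names must match the fields of the SmartFan data object defined later):

```json
{
  "fanOn": false,
  "autoOn": false
}
```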

At the top of this screen you should see a set of tabs. Next, you will want to go into the RULES tab.

Typically you would want to keep the rules as they are in order to keep your app data secure, though for this tutorial we will skip authentication to keep things focused on the IoT device. Set both the .read and .write rules to auth == null.
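For reference, the resulting rules JSON should look roughly like the following. Note that this leaves the database unsecured and is only meant to keep the tutorial focused:

```json
{
  "rules": {
    ".read": "auth == null",
    ".write": "auth == null"
  }
}
```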

Android Setup

Next you will want to create your Android projects. For this project we will be using the Android Studio 3.0 preview IDE, as it allows you to create Android Things projects through a template. Start by opening the program and selecting Start a new Android Studio project.

After you have named your application and picked a directory for it, you can select the Phone and Tablet template and the Android Things template, like so

You can click next through all of the remaining screens. Once your project has been created, you may need to change some files in order to handle issues with the templates. Under the build.gradle file in both the mobile and things modules, update the SDK version information to match the following

compileSdkVersion 26
buildToolsVersion "26.0.1"

defaultConfig {
    applicationId "com.ptrprograms.smartfan"
    minSdkVersion 26
    targetSdkVersion 26
}

You may also need to go into the project level build.gradle file and change the classpath string to the following: classpath 'com.android.tools.build:gradle:2.3.3'

Once your project syncs and is no longer displaying errors, you will want to copy the google-services.json file from Firebase into the root of both the things and mobile modules.

There are also additions that will need to be made to all three build.gradle files, as detailed on one of the initial Firebase screens.

Finally, in both the things and mobile build.gradle files, add the following dependency under the dependencies node

compile 'com.google.firebase:firebase-database:11.4.0'

Next we will focus on creating the IoT fan hardware.

Creating the IoT Fan

For this project, I took a simple oscillating table top fan and removed the casing from it by first removing the back.

and then the rest of the casing

The top consisted of five buttons: three that controlled the speed of the main fan motor, one that controlled oscillation, and a fifth that turned the fan off. I removed the top and cut each of the wires.

Next, I had to figure out which wire controlled what within the fan. I figured the brown wire on the end of the button strip acted as the main source of power, so I touched it to each of the other wires in order to see what each one activated. This is definitely a terrible way to test things, and most likely incredibly dangerous, so you probably shouldn't do things this way ¯\_(ツ)_/¯

At this point we know that connecting the brown wire to the black wire makes the fan oscillate, and connecting the brown wire to any of the other three runs the fan at one of its speeds. For this project we will only use one speed wire.

Since we want to add a Raspberry Pi to control the fan, and Raspberry Pis run at very low voltages, we will need a relay to connect these wires to our on/off logic. A relay sits in one position when no voltage is applied to its signal pin; when voltage is applied, a small electromagnet pulls an internal contact into a new position, closing a different connection. We will connect the brown wire to the COM terminal of one relay. While you could use the blue, red or white wire for turning the fan on, you only need one; I simply went with the white wire. Connect it to the NO (which stands for Normally Open) terminal on your relay.
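If it helps to reason about the wiring, the relay's switching behavior can be modeled as a toy truth table in code (purely illustrative; RelayModel is not part of the app):

```java
// Toy model of a single-pole relay: COM connects to NO while the coil is
// energized (signal pin high) and to NC otherwise.
class RelayModel {
    // true = the COM -> NO path is closed, so the fan receives power.
    static boolean fanPowered(boolean signalHigh) {
        return signalHigh;
    }

    // true = the COM -> NC path is closed (unused in this project).
    static boolean normallyClosedPath(boolean signalHigh) {
        return !signalHigh;
    }
}
```

Driving BCM18 high therefore closes the COM-NO path and powers the white speed wire.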

Now when you apply power to the relay's signal pin, the fan will turn on. Connect the relay's VCC pin to the Raspberry Pi's 5V pin, GND to GND, and the fan on/off relay signal to pin BCM18, as shown in the pinout chart below.

In addition, we will want to prepare for the Google Assistant portion of this tutorial. You will need to connect a USB microphone to one of the USB ports on the Raspberry Pi, a speaker to the AUX jack, and a button with the signal going into BCM23. If you are using an arcade button, like I am, you will want to make sure that it has a pole for normally-open and normally-closed so you can know when the button is pressed and released.

While you could use the AIY Voice Hat from Google for this, I am using a standard speaker and USB microphone because they are more readily available for anyone else working on this project. If you have a Voice Hat and want to use it, go right ahead.

At this point you can reassemble the fan casing, though I will be leaving it off of my project until the end for easier access and visuals.

Mobile App

The mobile app will have one purpose: controlling the states of the fan. Under your mobile module, create a new class and name it SmartFan. This will be the data object used to communicate with Firebase. The class has two booleans representing the fan's on/off state and its auto-on state, matching the values we set in Firebase earlier.

public class SmartFan {

    private boolean fanOn;
    private boolean autoOn;

    public SmartFan() {
    }

    public SmartFan(boolean fanOn, boolean autoOn) {
        this.fanOn = fanOn;
        this.autoOn = autoOn;
    }

    public boolean isAutoOn() {
        return autoOn;
    }

    public void setAutoOn(boolean autoOn) {
        this.autoOn = autoOn;
    }

    public boolean isFanOn() {
        return fanOn;
    }

    public void setFanOn(boolean fanOn) {
        this.fanOn = fanOn;
    }
}
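As a quick sanity check of the toggle pattern the app's buttons will use later (SmartFanDemo here is a hypothetical harness containing a condensed copy of the class above):

```java
// Condensed copy of the SmartFan bean, used to demonstrate the toggle
// the click listeners perform before writing the object back to Firebase.
// Firebase's getValue() deserialization relies on the public no-argument
// constructor, which is why the empty constructor above must stay.
class SmartFanDemo {
    static class SmartFan {
        private boolean fanOn;
        private boolean autoOn;
        public SmartFan() {}
        public boolean isFanOn() { return fanOn; }
        public void setFanOn(boolean fanOn) { this.fanOn = fanOn; }
        public boolean isAutoOn() { return autoOn; }
        public void setAutoOn(boolean autoOn) { this.autoOn = autoOn; }
    }

    // Flip the fan state, exactly as the mobile button handler does.
    static boolean toggleFan(SmartFan fan) {
        fan.setFanOn(!fan.isFanOn());
        return fan.isFanOn();
    }
}
```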

After you have created your data object, open activity_main.xml in order to set up your layout for the mobile app. You will want to insert the following code:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical"
    tools:context="com.ptrprograms.smartfan.MainActivity">

    <Button
        android:id="@+id/fan_state"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Toggle Fan" />

    <Button
        android:id="@+id/auto_on_state"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Toggle Auto On" />

</LinearLayout>

The above layout simply displays two buttons for toggling the fan's states.

Next, open the MainActivity.java class. This is where your programming logic for interacting with Firebase will take place. To start, add the following member variables to the top of the class:

private static final String FAN_URL = "https://smart-fan.firebaseio.com/";
private Button fanStateButton;
private Button autoOnFanStateButton;
private DatabaseReference databaseRef;
private SmartFan smartFan;

where your FAN_URL matches the URL from your Firebase project.

You will then want to initialize your DatabaseReference in onCreate() and add a listener for when values in Firebase change.

databaseRef = FirebaseDatabase.getInstance().getReferenceFromUrl(FAN_URL);
databaseRef.addValueEventListener(new ValueEventListener() {
    @Override
    public void onDataChange(DataSnapshot dataSnapshot) {
        smartFan = dataSnapshot.getValue(SmartFan.class);
    }

    @Override
    public void onCancelled(DatabaseError databaseError) {
    }
});

Finally, create references to your Button layout objects and add click listeners that update the SmartFan object in Firebase when they are pressed.

fanStateButton = (Button) findViewById(R.id.fan_state);
fanStateButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        smartFan.setFanOn(!smartFan.isFanOn());
        databaseRef.setValue(smartFan);
    }
});

autoOnFanStateButton = (Button) findViewById(R.id.auto_on_state);
autoOnFanStateButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        smartFan.setAutoOn(!smartFan.isAutoOn());
        databaseRef.setValue(smartFan);
    }
});

Once you have filled in the onCreate() method, the mobile application is complete. Run it on a device and press the buttons while you have the Firebase database console open. You should see the values flash yellow and change as you interact with the app.

Creating the Android Things App

Before you can start controlling an Android Things powered device, you must flash Android Things onto the board that you will be using. For this project, I am using a Raspberry Pi 3B (because they're the only Things-supported boards I have right now :p), though you could easily use any other Things device. For instructions on flashing Android Things onto a Raspberry Pi SD card, see the official documentation.

Once you are connected to your Android Things Pi, it's time to start writing the Android app that will run on it. You will first need to copy the SmartFan object from the mobile module into the things module.

Next, create the member variables that will be needed for your device

private static final String FAN_URL = "https://smart-fan.firebaseio.com/";
private static final String FAN_STATE_GPIO_PIN = "BCM18";
private DatabaseReference databaseRef;
private Gpio fanStateSignal;
private SmartFan smartFanStates;

where FAN_URL is again your Firebase URL for your project. Once you have your member variables, you will update onCreate() to connect to Firebase in order to pull down and listen to changed values in your database, as well as control signals to your fan's relay.

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    initFirebase();
    initFanSignals();
}

In the above, initFanSignals() is where we initialize the GPIO pin for the fan relay.

private void initFanSignals() {
    PeripheralManagerService service = new PeripheralManagerService();
    try {
        fanStateSignal = service.openGpio(FAN_STATE_GPIO_PIN);
        fanStateSignal.setDirection(Gpio.DIRECTION_OUT_INITIALLY_LOW);
    } catch (IOException e) {
        // Unable to open the GPIO pin; the fan cannot be controlled.
    }
}

and initFirebase() is declared as

private void initFirebase() {
    databaseRef = FirebaseDatabase.getInstance().getReferenceFromUrl(FAN_URL);
    databaseRef.addValueEventListener(new ValueEventListener() {
        @Override
        public void onDataChange(DataSnapshot dataSnapshot) {
            smartFanStates = dataSnapshot.getValue(SmartFan.class);
            try {
                fanStateSignal.setValue(smartFanStates.isFanOn());
            } catch (IOException e) {
                // Ignore; the relay simply keeps its previous state.
            }
        }

        @Override
        public void onCancelled(DatabaseError databaseError) {
        }
    });
}

Finally, create an onDestroy() method where you can clean up all connections if the app ever has a situation where it needs to close.

@Override
protected void onDestroy() {
    super.onDestroy();
    if( fanStateSignal != null ) {
        try {
            fanStateSignal.setValue(false);
            fanStateSignal.close();
            fanStateSignal = null;
        } catch (IOException e) {
            // Nothing to do if the pin fails to close during shutdown.
        }
    }
}

Once you have your code set up, you should be able to install the Things app onto your Raspberry Pi and open your app on a mobile phone (or change the database values directly in Firebase). When you do this, the fan should turn on and off appropriately.

STEP 2: Google Assistant

Now that the fan can be turned on and off, let's make it more exciting. Google Assistant lets you add a lot of voice functionality to a device, though in this project we'll keep it simple and just add the same functionality that you would get from a Google Home device. The end goal here will be to have the ability to wake up in the morning, say "good morning", and hear the Google Assistant's response. All code for this section will occur in the things module of your app, though we will start by creating the Google Assistant credentials. You can find documentation for creating your credentials and enabling the Google Assistant API in the official sample here, so go ahead and get that set up before moving back to your Android app. The credentials.json file that gets generated during this process will go into the res/raw directory of the things module.

After your credentials are created with Google, you will need to declare some permissions for your app. Open the AndroidManifest.xml file and add the following lines within the manifest tag, but before the application tag.

<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="com.google.android.things.permission.MANAGE_AUDIO_DRIVERS" />

It's worth noting that you will need to restart your device after installing the app with these permissions in order for them to be granted.

Next you will need to copy the gRPC module into your app for communicating with the home device. This gets a little tricky, so the best place to get it is from the Google Assistant Android Things sample app, which can be found here. You will then need to update your settings.gradle file to reflect the new module.

include ':mobile', ':things', ':grpc'

After updating settings.gradle, add the following lines to the things module's build.gradle file to include the gRPC module as a dependency, along with Google's button driver.

compile project(':grpc')
compile 'com.google.android.things.contrib:driver-button:0.4'

and include protobuf as a dependency in your project level build.gradle file.

classpath "com.google.protobuf:protobuf-gradle-plugin:0.8.0"

At this point your project should sync and compile. Your modules on the side of Android Studio may look like this:

Next, let's include the oauth2 library in our project by opening the things module's build.gradle file and adding the following under the dependencies node:

compile('com.google.auth:google-auth-library-oauth2-http:0.6.0') {
    exclude group: 'org.apache.httpcomponents', module: 'httpclient'
}

You may run into dependency conflicts here if your project includes the Espresso dependency, with an error message similar to this:

Warning:Conflict with dependency 'com.google.code.findbugs:jsr305' in project ':things'. Resolved versions for app (1.3.9) and test app (2.0.1) differ. See http://g.co/androidstudio/app-test-app-conflict for details. 

If so, just remove the Espresso dependency from build.gradle.
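Alternatively, if you would rather keep Espresso, forcing a single jsr305 version is a common Gradle workaround (a sketch; the version number here is assumed from the error message above):

```groovy
// In the things module's build.gradle
configurations.all {
    resolutionStrategy.force 'com.google.code.findbugs:jsr305:1.3.9'
}
```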

After you have synced your project, create a new class named Credentials.java to access your credentials.

public class Credentials {
    static UserCredentials fromResource(Context context, int resourceId)
            throws IOException, JSONException {
        InputStream is = context.getResources().openRawResource(resourceId);
        byte[] bytes = new byte[is.available()];
        is.read(bytes);
        JSONObject json = new JSONObject(new String(bytes, "UTF-8"));
        return new UserCredentials(
                json.getString("client_id"),
                json.getString("client_secret"),
                json.getString("refresh_token")
        );
    }
}

This will pull the credentials from your credentials.json file when they are needed by your app. Now that the initial setup is done, go ahead and open up MainActivity.java again and start adding the code to support the Google Assistant. You can start by adding the following member variables to the top of the class.

private static final String ASSISTANT_BUTTON_GPIO_PIN = "BCM23";
private Button mButton;
private static final String ASSISTANT_ENDPOINT = "embeddedassistant.googleapis.com";
private static final int BUTTON_DEBOUNCE_DELAY_MS = 20;
private static final String PREF_CURRENT_VOLUME = "current_volume";
private static final int SAMPLE_RATE = 16000;
private static final int ENCODING = AudioFormat.ENCODING_PCM_16BIT;
private static final int DEFAULT_VOLUME = 100;
private static final int SAMPLE_BLOCK_SIZE = 1024;
private AudioTrack mAudioTrack;
private AudioRecord mAudioRecord;
private int mVolumePercentage = DEFAULT_VOLUME;
private ByteString mConversationState = null;
private HandlerThread mAssistantThread;
private Handler mAssistantHandler;
private static AudioInConfig.Encoding ENCODING_INPUT = AudioInConfig.Encoding.LINEAR16;
private static AudioOutConfig.Encoding ENCODING_OUTPUT = AudioOutConfig.Encoding.LINEAR16;
private EmbeddedAssistantGrpc.EmbeddedAssistantStub mAssistantService;
private StreamObserver<ConverseRequest> mAssistantRequestObserver;

The above consists of a Button object and its pin name, plus objects needed for threading, audio recording and playback, and the Assistant API. Below these member variables, you will need to create AudioFormat objects that will be used later in this class.

private static final AudioFormat AUDIO_FORMAT_STEREO =
        new AudioFormat.Builder()
                .setChannelMask(AudioFormat.CHANNEL_IN_STEREO)
                .setEncoding(ENCODING)
                .setSampleRate(SAMPLE_RATE)
                .build();

private static final AudioFormat AUDIO_FORMAT_OUT_MONO =
        new AudioFormat.Builder()
                .setChannelMask(AudioFormat.CHANNEL_OUT_MONO)
                .setEncoding(ENCODING)
                .setSampleRate(SAMPLE_RATE)
                .build();

private static final AudioFormat AUDIO_FORMAT_IN_MONO =
        new AudioFormat.Builder()
                .setChannelMask(AudioFormat.CHANNEL_IN_MONO)
                .setEncoding(ENCODING)
                .setSampleRate(SAMPLE_RATE)
                .build();
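To put these numbers in perspective: at a 16 kHz sample rate with 16-bit mono PCM, each 1024-byte SAMPLE_BLOCK_SIZE read holds 512 samples, i.e. 32 ms of audio. The arithmetic (illustrative only, using the same constants as the class above):

```java
class AudioMath {
    static final int SAMPLE_RATE = 16000;      // Hz, matches SAMPLE_RATE above
    static final int BYTES_PER_SAMPLE = 2;     // 16-bit PCM, mono
    static final int SAMPLE_BLOCK_SIZE = 1024; // bytes per microphone read

    // Samples contained in one block read from the microphone.
    static int samplesPerBlock() {
        return SAMPLE_BLOCK_SIZE / BYTES_PER_SAMPLE;
    }

    // Milliseconds of mono audio covered by one block.
    static int blockDurationMs() {
        return samplesPerBlock() * 1000 / SAMPLE_RATE;
    }
}
```

This is why the streaming Runnable later re-posts itself after each read: the Assistant receives a continuous sequence of small 32 ms chunks.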

For the final values at the top of the class, you will need to create objects related to requests and responses from Google Assistant. These four objects actually do a lot, but for now you can just note that they are used for handling recorded audio and playing responses.

private StreamObserver<ConverseResponse> mAssistantResponseObserver =
        new StreamObserver<ConverseResponse>() {
            @Override
            public void onNext(ConverseResponse value) {
                switch (value.getConverseResponseCase()) {
                    case EVENT_TYPE:
                        break;
                    case RESULT:
                        final String spokenRequestText = value.getResult().getSpokenRequestText();
                        Log.e("Test", "spokenrequesttext: " + spokenRequestText);
                        mConversationState = value.getResult().getConversationState();
                        if (value.getResult().getVolumePercentage() != 0) {
                            mVolumePercentage = value.getResult().getVolumePercentage();
                            float newVolume = AudioTrack.getMaxVolume() * mVolumePercentage / 100.0f;
                            mAudioTrack.setVolume(newVolume);
                            SharedPreferences.Editor editor = PreferenceManager
                                    .getDefaultSharedPreferences(MainActivity.this).edit();
                            editor.putFloat(PREF_CURRENT_VOLUME, newVolume);
                            editor.apply();
                        }
                        break;
                    case AUDIO_OUT:
                        final ByteBuffer audioData =
                                ByteBuffer.wrap(value.getAudioOut().getAudioData().toByteArray());
                        mAudioTrack.write(audioData, audioData.remaining(), AudioTrack.WRITE_BLOCKING);
                        break;
                    case ERROR:
                        break;
                }
            }

            @Override
            public void onError(Throwable t) {}

            @Override
            public void onCompleted() {}
        };

private Runnable mStartAssistantRequest = new Runnable() {
    @Override
    public void run() {
        mAudioRecord.startRecording();
        mAssistantRequestObserver = mAssistantService.converse(mAssistantResponseObserver);
        ConverseConfig.Builder converseConfigBuilder =
                ConverseConfig.newBuilder()
                        .setAudioInConfig(AudioInConfig.newBuilder()
                                .setEncoding(ENCODING_INPUT)
                                .setSampleRateHertz(SAMPLE_RATE)
                                .build())
                        .setAudioOutConfig(AudioOutConfig.newBuilder()
                                .setEncoding(ENCODING_OUTPUT)
                                .setSampleRateHertz(SAMPLE_RATE)
                                .setVolumePercentage(mVolumePercentage)
                                .build());
        if (mConversationState != null) {
            converseConfigBuilder.setConverseState(
                    ConverseState.newBuilder()
                            .setConversationState(mConversationState)
                            .build());
        }
        mAssistantRequestObserver.onNext(ConverseRequest.newBuilder()
                .setConfig(converseConfigBuilder.build())
                .build());
        mAssistantHandler.post(mStreamAssistantRequest);
    }
};

private Runnable mStreamAssistantRequest = new Runnable() {
    @Override
    public void run() {
        ByteBuffer audioData = ByteBuffer.allocateDirect(SAMPLE_BLOCK_SIZE);
        int result =
                mAudioRecord.read(audioData, audioData.capacity(), AudioRecord.READ_BLOCKING);
        if (result < 0) {
            return;
        }
        mAssistantRequestObserver.onNext(ConverseRequest.newBuilder()
                .setAudioIn(ByteString.copyFrom(audioData))
                .build());
        mAssistantHandler.post(mStreamAssistantRequest);
    }
};

private Runnable mStopAssistantRequest = new Runnable() {
    @Override
    public void run() {
        mAssistantHandler.removeCallbacks(mStreamAssistantRequest);
        if (mAssistantRequestObserver != null) {
            mAssistantRequestObserver.onCompleted();
            mAssistantRequestObserver = null;
        }
        mAudioRecord.stop();
        mAudioTrack.play();
    }
};

In onCreate() you will have a few more things that need to be completed. Update the method to look like the following:

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    initAssistantThread();
    initAssistantButton();
    initAudio();
    initFirebase();
    initFanSignals();
}

Here initAssistantThread() simply initializes the background thread used for the Assistant API's requests

private void initAssistantThread() {
    mAssistantThread = new HandlerThread("assistantThread");
    mAssistantThread.start();
    mAssistantHandler = new Handler(mAssistantThread.getLooper());
}

initAssistantButton() is similar to initFanSignals() in that it initializes the GPIO for the button used to trigger the microphone through the Android Things Peripherals API

private void initAssistantButton() {
    try {
        mButton = new Button(ASSISTANT_BUTTON_GPIO_PIN, Button.LogicState.PRESSED_WHEN_HIGH);
        mButton.setDebounceDelay(BUTTON_DEBOUNCE_DELAY_MS);
        mButton.setOnButtonEventListener(this);
    } catch (IOException e) {
        // Unable to open the button's GPIO pin; the Assistant can't be triggered.
    }
}

Finally, initAudio() initializes the Assistant API by checking credentials and setting up audio input and output on your Android Things board.

private void initAudio() {
    AudioManager manager = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
    int maxVolume = manager.getStreamMaxVolume(AudioManager.STREAM_MUSIC);
    manager.setStreamVolume(AudioManager.STREAM_MUSIC, mVolumePercentage * maxVolume / 100, 0);

    int outputBufferSize = AudioTrack.getMinBufferSize(AUDIO_FORMAT_OUT_MONO.getSampleRate(),
            AUDIO_FORMAT_OUT_MONO.getChannelMask(),
            AUDIO_FORMAT_OUT_MONO.getEncoding());
    mAudioTrack = new AudioTrack.Builder()
            .setAudioFormat(AUDIO_FORMAT_OUT_MONO)
            .setBufferSizeInBytes(outputBufferSize)
            .build();
    mAudioTrack.play();

    int inputBufferSize = AudioRecord.getMinBufferSize(AUDIO_FORMAT_STEREO.getSampleRate(),
            AUDIO_FORMAT_STEREO.getChannelMask(),
            AUDIO_FORMAT_STEREO.getEncoding());
    mAudioRecord = new AudioRecord.Builder()
            .setAudioSource(MediaRecorder.AudioSource.MIC)
            .setAudioFormat(AUDIO_FORMAT_IN_MONO)
            .setBufferSizeInBytes(inputBufferSize)
            .build();

    SharedPreferences preferences = PreferenceManager.getDefaultSharedPreferences(this);
    float initVolume = preferences.getFloat(PREF_CURRENT_VOLUME, maxVolume);
    mAudioTrack.setVolume(initVolume);
    mVolumePercentage = Math.round(initVolume * 100.0f / maxVolume);

    ManagedChannel channel = ManagedChannelBuilder.forTarget(ASSISTANT_ENDPOINT).build();
    try {
        mAssistantService = EmbeddedAssistantGrpc.newStub(channel)
                .withCallCredentials(MoreCallCredentials.from(
                        Credentials.fromResource(this, R.raw.credentials)
                ));
    } catch (IOException | JSONException e) {
        // Credentials could not be read; the Assistant will be unavailable.
    }
}

Now that initializations are set up, you will need to implement the Button.OnButtonEventListener interface on MainActivity.java by updating the top class line to the following:

public class MainActivity extends Activity implements Button.OnButtonEventListener {

and overriding the onButtonEvent() method to start and stop audio recordings for the Assistant.

@Override
public void onButtonEvent(Button button, boolean pressed) {
    if (pressed) {
        mAssistantHandler.post(mStartAssistantRequest);
    } else {
        mAssistantHandler.post(mStopAssistantRequest);
    }
}

Finally, update onDestroy() to include the following cleanup code

if (mAudioRecord != null) {
    mAudioRecord.stop();
    mAudioRecord = null;
}
if (mAudioTrack != null) {
    mAudioTrack.stop();
    mAudioTrack = null;
}
if (mButton != null) {
    try {
        mButton.close();
    } catch (IOException e) {
        // Nothing to do if closing the button fails during shutdown.
    }
    mButton = null;
}
mAssistantHandler.post(new Runnable() {
    @Override
    public void run() {
        mAssistantHandler.removeCallbacks(mStreamAssistantRequest);
    }
});
mAssistantThread.quitSafely();

At this point you should be able to go to your fan in the morning and say "good morning" for an update on your day.

STEP 3: Voice Controls with Google Actions/API.AI

What's a smart device in the home without Google Home Actions? Let's go ahead and make this fan even better by letting it respond to voice commands. The first thing you will need to do is set up Firebase to support a new webhook and upload a Node.js program to Firebase to handle Google Actions. Once that's done, you will create voice actions using Google's API.AI service and tie them together with your Firebase function. Let's start by setting up Firebase Functions.

First, you will need Node.js installed on your computer. I'm running OSX with Homebrew installed, so I installed Node.js (which bundles npm) with the command

brew install node

After you have Node.js installed, go ahead and install the Actions SDK for Node.js with the command npm i actions-on-google

When that finishes, you will need to install Firebase Tools. In a terminal, you can do this with the command npm install -g firebase-tools

This will start the installation process, and should look similar to the following

Next, run the command firebase login in order to log into your Google account for your device. This will open a browser and allow you to log into your Gmail account. Select your email and click on the blue Allow button to proceed.

Now that you are authenticated, go into your terminal and ensure that you are in the directory for your Firebase backend code. Once there, type the command firebase init functions

You will then be prompted to select one of your Firebase projects. In this case, I am using the project created for the fan: smart-fan.

Follow the next prompt to install any dependencies, and then Firebase should finish initializing. Within your project folder you should see another directory named functions. Go into this directory and open the file index.js in a text editor. Paste the following code into this file and save it.

'use strict';

process.env.DEBUG = 'actions-on-google:*';

const functions = require('firebase-functions');
// The Firebase Admin SDK to access the Firebase Realtime Database.
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);

const Assistant = require('actions-on-google').ApiAiAssistant;
const ACTION_FAN = 'fan';

exports.fanControl = functions.https.onRequest((req, res) => {
    const assistant = new Assistant({request: req, response: res});

    function fanControl (assistant) {
        var ref = admin.database().ref('/');
        if (req.body.result.parameters.control.toString() === 'on') {
            ref.update({"fanOn": true});
        } else {
            ref.update({"fanOn": false});
        }
    }

    const actionMap = new Map();
    actionMap.set(ACTION_FAN, fanControl);
    assistant.handleRequest(actionMap);
});

The above code listens for an HTTPS request, creates a reference to the Google Assistant, and, when an action with a value of fan is received, checks the associated JSON data for a key named control to see if the value is on or off. Once this value is determined, the function updates Firebase to turn the fan on or off accordingly.
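
To make that lookup concrete, here is a trimmed, hypothetical sketch of the kind of payload API.AI posts to the webhook (real requests carry many more fields) and the boolean the function derives from it:

```javascript
// Hypothetical, heavily trimmed API.AI webhook payload, for illustration only.
const sampleRequestBody = {
  result: {
    action: 'fan',
    parameters: { control: 'on' }
  }
};

// Mirrors the check inside fanControl(): map the control parameter to the
// boolean that gets written to the fanOn key in Firebase.
function fanOnFromRequest(body) {
  return body.result.parameters.control.toString() === 'on';
}

console.log(fanOnFromRequest(sampleRequestBody)); // true
```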

Returning to your terminal, enter the command firebase deploy --only functions to upload this function to Firebase. When that finishes, you will be given a link to the function. Make note of this, as you will need it soon.

Next, go to the API.AI console and create a new agent by entering a description and other information, and selecting your smart fan Google project. Once your agent is created, click on Fulfillment in the left column. Enter the URL for your Firebase function and make sure Enable webhooks for all domains from the DOMAINS dropdown is selected.

Next, select Entities from the left column. Entities are the key words that will be used by Google Actions to determine if an action can be performed. Click on the blue CREATE ENTITY button at the top of the screen to get started. On the next screen, enter a name for the entity class and create two entries: on and off. These will be the values that are passed to your Firebase function in the JSON data. You can add additional words for each entry, such as start and stop, to add variation for users. Once you're done, click on the blue SAVE button.
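
Conceptually, the entity you just defined is a small synonym table that maps spoken words to the two canonical values. A sketch of that idea (the synonym lists here are illustrative examples, not the exact words API.AI stores):

```javascript
// Illustrative synonym table mirroring the on/off entity entries above.
const controlEntity = {
  on: ['on', 'start'],
  off: ['off', 'stop']
};

// Resolve a spoken word to its canonical entity value, or null if unknown.
function resolveControl(word) {
  for (const [canonical, synonyms] of Object.entries(controlEntity)) {
    if (synonyms.includes(word.toLowerCase())) {
      return canonical;
    }
  }
  return null;
}

console.log(resolveControl('start')); // "on"
```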

After your entities are created, open the Intents section and click on CREATE INTENT. Under the Action section, set the value to fan, as this is the keyword that your Firebase function is looking for. You will also want to start adding sentences to the Add user expressions area that the user can say to control the fan. Keywords, such as on and stop, will be highlighted as entities, and the agent will train itself after each entry. Towards the bottom of the screen, click on Fulfillment and check the Use webhook checkbox.

Finally, click on the Integrations item in the left column, and enable Actions on Google.

At this point you should be able to start testing your responses. You can type into the test area on the right side of the screen to see what the responses are. Go ahead and type "start the fan" and you should see the value in Firebase change. If you type "stop the fan", Firebase should reflect fanOn as false.

Testing with a real device gets a bit more tricky. You will want to go to the Actions console, go into the simulator and enable testing. I did this process while poking around at things, so I can't give you more specific steps (because I can't re-enable it :P), but you should be able to speak to any Google Assistant device associated with your Google account to start a conversation with your fan.

Since you are in a conversation mode with your device, you can also add small-talk features in API.AI, alarm intents, or any other actions that would be useful for your fan. This is a good opportunity to play with what's available and make something cool.

STEP 4: Auto-On

You may notice that auto-on has been included in the data. Since the root problem that started this project is that I often forget to turn on the fan before getting into bed, I thought it would be good if the fan just automatically turned on for me. To make this happen, I use the Awareness API from Google Play Services to keep track of time and turn the fan on automatically if the autoOn value is set to true in Firebase.

You will first need to enable the Awareness API in the Google Cloud Console by opening the API's page, clicking the blue Manage button, and then Enable. Next, you will need to generate an API key for your Android Things device. Click on the Credentials item in the left navigation column, click on the blue Create credentials dropdown and select API key. This will bring up a dialog where you can copy your new API key.

Now that you have an API key and the Awareness API enabled, go into the AndroidManifest.xml file for your Things module and add the following into the application node.

<meta-data
    android:name="com.google.android.awareness.API_KEY"
    android:value="API_KEY_HERE"/>

You will also need to include the ACCESS_FINE_LOCATION permission with your other permissions in this manifest in order to use time with the Awareness API.

<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />

For this example, we will only use the time tools in the Awareness API, though there would be a few more steps here if you use the Places API or the Nearby API for beacons.

You will also need to include the Awareness API in your module by going into the build.gradle file and including the Google Play Services awareness library under the dependencies node.

compile 'com.google.android.gms:play-services-awareness:11.4.0'

Next, go into the MainActivity.java file in your Things module. The rest of the setup will occur in this class. You will need to include some member variables at the top of your class.

private static final String TIME_FENCE_KEY = "time_fence_key";
private final String FENCE_RECEIVER_ACTION = "FENCE_RECEIVER_ACTION";
private GoogleApiClient googleApiClient;
private PendingIntent pendingIntent;
private FenceReceiver fenceReceiver;
private static final int START_WINDOW_IN_MILLIS = 36000000;
private static final int END_WINDOW_IN_MILLIS = 75600000;

The first two values are keys for keeping track of your Awareness API fence and the Awareness API intent. After that you have the Google API Client (which is the key to a lot of other Google Play Services tools), a PendingIntent to use with the Awareness API, and a reference to a new inner class named FenceReceiver that we will define later in this section. The final two values define the window of time during which the fan will attempt to turn on automatically, expressed as milliseconds since midnight. For testing, I have set this to 10 AM through 9 PM.
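
The two window constants are just milliseconds since midnight, so 10 AM and 9 PM work out as follows (a quick sanity check, not project code):

```javascript
// Convert an hour of the day (0-23) to milliseconds since midnight.
function hourToMillis(hour) {
  return hour * 60 * 60 * 1000;
}

console.log(hourToMillis(10)); // 36000000 -> START_WINDOW_IN_MILLIS
console.log(hourToMillis(21)); // 75600000 -> END_WINDOW_IN_MILLIS
```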

In the class declaration of your application, you will need to implement the GoogleApiClient.ConnectionCallbacks interface by changing that line to the following:

public class MainActivity extends Activity implements Button.OnButtonEventListener, GoogleApiClient.ConnectionCallbacks {

Next, in onCreate(), add a call to a new method called initPlayServices(). This method will initialize the GoogleApiClient and declare that it uses the Awareness API; the actual connection to Google Play Services will happen later, once the fan's state has been read from Firebase.

private void initPlayServices() {
    googleApiClient = new GoogleApiClient.Builder(this)
            .addApi(Awareness.API)
            .addConnectionCallbacks(this)
            .build();
}

Once the API client connects, the onConnected() method that is required by the GoogleApiClient.ConnectionCallbacks interface will be called. This is where you will initialize the PendingIntent that triggers your BroadcastReceiver, create an instance of your FenceReceiver and call a method to set up your new fence.

@Override
public void onConnected(@Nullable Bundle bundle) {
    Intent intent = new Intent(FENCE_RECEIVER_ACTION);
    pendingIntent =
            PendingIntent.getBroadcast(MainActivity.this, 0, intent, 0);
    fenceReceiver = new FenceReceiver();
    registerReceiver(fenceReceiver, new IntentFilter(FENCE_RECEIVER_ACTION));
    setupFences();
}

The setupFences() method will create a new AwarenessFence using the TimeFence class and associate it with the FenceApi and your PendingIntent for FenceReceiver. If you wanted to make the fence more detailed, you could create additional AwarenessFence objects (like the TimeFence) and combine them here with the AwarenessFence.and() or AwarenessFence.or() operations.

@SuppressWarnings({"MissingPermission"})
private void setupFences() {
    AwarenessFence timeFence = TimeFence.inDailyInterval(
            TimeZone.getDefault(), START_WINDOW_IN_MILLIS, END_WINDOW_IN_MILLIS);
    Awareness.FenceApi.updateFences(
            googleApiClient,
            new FenceUpdateRequest.Builder()
                    .addFence(TIME_FENCE_KEY, timeFence, pendingIntent)
                    .build());
}

You'll notice that we also have to annotate this method with SuppressWarnings for MissingPermission. This is because we aren't asking the user to manually grant permissions, since Android Things handles that for us, but lint treats the missing runtime permission request as an error.

Let's go ahead and create our inner class FenceReceiver. This receiver will listen for the PendingIntent that was created earlier, and check that the AwarenessFence state is TRUE before trying to turn on the fan.

public class FenceReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        FenceState fenceState = FenceState.extract(intent);
        if (TextUtils.equals(fenceState.getFenceKey(), TIME_FENCE_KEY)) {
            if (fenceState.getCurrentState() == FenceState.TRUE
                    && smartFanStates != null && smartFanStates.isAutoOn()) {
                turnOnFan();
            }
        }
    }
}

Here turnOnFan() works similarly to how the mobile app activates the fan through Firebase:

private void turnOnFan() {
    smartFanStates.setFanOn(true);
    databaseRef.setValue(smartFanStates);
}

To tie everything together, go into onDataChange(), where you receive your fan state data, and save the autoOn value. You will then check whether the GoogleApiClient is already connected or connecting, and if it is not, connect it. The connection must be made here because the fence callback relies on the data from Firebase, which isn't guaranteed to have been pulled down if the GoogleApiClient connects earlier.

@Override
public void onDataChange(DataSnapshot dataSnapshot) {
    smartFanStates = dataSnapshot.getValue(SmartFan.class);
    try {
        fanStateSignal.setValue(smartFanStates.isFanOn());
    } catch (IOException e) {
        // Ignored: unable to update the fan's GPIO signal.
    }
    if (!googleApiClient.isConnected() && !googleApiClient.isConnecting()) {
        googleApiClient.connect();
    }
}

To wrap things up, there's some teardown that needs to occur as the app closes by removing the fence and disconnecting from Google Play Services. This isn't as important as it would be on a mobile device, as Android Things devices only run one app at a time, but it's still a good practice.

@Override
protected void onStop() {
    Awareness.FenceApi.updateFences(
            googleApiClient,
            new FenceUpdateRequest.Builder()
                    .removeFence(TIME_FENCE_KEY)
                    .build());
    googleApiClient.disconnect();
    if (fenceReceiver != null) {
        unregisterReceiver(fenceReceiver);
    }
    super.onStop();
}

Note: Depending on when you make this project, you may run into issues with Android Things not supporting the latest Play Services (I had set everything to 11.4.0). At the time of this writing, the Things device only supports 11.0.0, so you may need to change your dependencies.

Fan turning on automatically during time window

STEP 5: Alexa Support

While Google Home is a great addition, a lot of people also have Amazon Echo devices. Because of this, I decided to try my hand at adding Alexa support to the fan. I don't personally own an Echo, but I was able to use the simulator on the AWS console.

The first thing that you will need to do is go to the AWS Management Console and create an account. Once you have an account, you can search for 'lambda' to pull up the Lambda Console.

Once you've landed on the Lambda Console, you can create a new function by clicking on the Create function button in the top right corner.

On this screen you can select Author from scratch and enter a name for your new lambda function. You will also need to select a role for your function. For this, I selected to create a new role from the dropdown menu.

With your role created, you can select it on the configuration screen and then select Create function from the lower right corner.

Now it's time to add some code. You should have been presented with a simple web-based text editor for adding your Lambda function. Since this fan is driven by values in Firebase, our Lambda only needs to know whether it should turn the fan on or off, and then set those values in Firebase. While we were able to use the Node.js SDK for Firebase with Google Actions, we will need to use the REST API here to update those values. The following method will allow you to specify a key and a value that will be placed into the Firebase database.

var https = require('https');
var firebaseHost = "smart-fan.firebaseio.com";

function fbPut(key, value) {
  return new Promise((resolve, reject) => {
    var options = {
      hostname: firebaseHost,
      port: 443,
      path: key + ".json",
      method: 'PUT'
    };
    var req = https.request(options, function (res) {
      console.log("request made");
      res.setEncoding('utf8');
      var body = '';
      res.on('data', function(chunk) {
        body += chunk;
      });
      res.on('end', function() {
        resolve(body);
      });
    });
    req.end(JSON.stringify(value));
    req.on('error', reject);
  });
}
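
To make the REST semantics concrete: writing true to /fanOn issues an HTTPS PUT to https://smart-fan.firebaseio.com/fanOn.json with the JSON-encoded value as the request body. This sketch assembles the same request pieces fbPut() does, without opening a socket:

```javascript
// Build the request options and body the way fbPut() does, for inspection only.
function buildFbPutRequest(host, key, value) {
  return {
    options: { hostname: host, port: 443, path: key + '.json', method: 'PUT' },
    body: JSON.stringify(value)
  };
}

const req = buildFbPutRequest('smart-fan.firebaseio.com', '/fanOn', true);
console.log(req.options.path); // "/fanOn.json"
console.log(req.body);         // "true"
```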

In order to use the above method, you will need to check the header values that are passed to your lambda function to determine if the fan should be turned on or off. You can do this by setting the entry point for your function to the following.

exports.handler = function (event, context, callback) { 
   if( event.header.name === "TurnOnRequest" ) { 
       fbPut("/fanOn", true); 
   } else if( event.header.name === "TurnOffRequest" ) { 
       fbPut("/fanOn", false); 
   } 
}; 
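
For reference, here is a trimmed, hypothetical example of the event shapes the handler branches on (real Alexa Smart Home events also carry a namespace, payloadVersion, messageId and payload object), along with the fanOn value each one maps to:

```javascript
// Hypothetical, trimmed Alexa Smart Home events; only header.name matters here.
const turnOnEvent = { header: { name: 'TurnOnRequest' } };
const turnOffEvent = { header: { name: 'TurnOffRequest' } };

// Mirrors the branching in exports.handler: return the fanOn value that would
// be written to Firebase, or null for events the handler ignores.
function fanStateForEvent(event) {
  if (event.header.name === 'TurnOnRequest') return true;
  if (event.header.name === 'TurnOffRequest') return false;
  return null;
}

console.log(fanStateForEvent(turnOnEvent));  // true
console.log(fanStateForEvent(turnOffEvent)); // false
```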

When those two blocks are available and saved in your lambda method, it's time to test. On the top right of the screen you'll notice a test button and a dropdown menu. In the dropdown, select Configure test event. On the next screen, click the Create new test event radio button and in the Event template dropdown scroll to the Alexa header and select Alexa Smart Home - Turn On.

Once you select the event type, you should see the test response JSON on the testing screen.

You will want to do the same with Alexa Smart Home - Turn Off.

You should be able to select one of the new events from the dropdown on the main screen and click the Test button, which will cause a log to appear at the bottom of the screen as the Lambda function runs to completion.

If you watch the Firebase console, you will also notice the value change to true/false depending on the Alexa event.

STEP 6: Arduino + NeoPixel Ring

One problem with Android Things is that the smallest increment of time that it can work in is milliseconds, but a large number of electronic components require operations in the microseconds range. One of these components is the NeoPixel LED ring. This ring will allow the fan to display various states for the Google Assistant or on/off states. To use this ring, we will need to add another MCU to our device that communicates with the Android Things board over a UART connection and handles the state of the ring based on input from the core device. To do this, we will add an Arduino UNO to our project.

The first thing we will need to do is open up the communication channel between our Android Things project and our Arduino Uno. In our Things module, we will initialize a UART connection to our Arduino like so:

private void initArduino() {
    PeripheralManagerService service = new PeripheralManagerService();
    try {
        arduinoDevice = service.openUartDevice("UART0");
        arduinoDevice.setBaudrate(9600);
        arduinoDevice.setDataSize(8);
        arduinoDevice.setParity(UartDevice.PARITY_NONE);
        arduinoDevice.setStopBits(1);
        arduinoDevice.registerUartDeviceCallback(new UartDeviceCallback() {
            @Override
            public boolean onUartDeviceDataAvailable(UartDevice uart) {
                byte[] buffer = new byte[1];
                try {
                    uart.read(buffer, 1);
                    Log.e("Test", "received code: " + new String(buffer, Charset.forName("UTF-8")));
                } catch (IOException e) {
                }
                return super.onUartDeviceDataAvailable(uart);
            }
        });
    } catch (IOException e) {
        Log.e("Test", "error on initializing arduino");
    }
}

The above will configure our UART connection and add a callback for anything that is sent back. We won't do anything with the callback, but it is included for completeness. When we want to write to the Arduino, we can use a new helper method defined as

private void writeCodeToArduino(char code) {
    try {
        byte[] data = new byte[1];
        data[0] = (byte) code;
        arduinoDevice.write(data, data.length);
    } catch (IOException e) {
        Log.e("Test", "exception writing to Arduino");
    }
}

We will use this when the state of the fan is set, and when a conversation starts or finishes.

@Override
public void onRequestStart() {
    writeCodeToArduino(LED_RING_CODE_CONVO_START);
}
...
@Override
public void onConversationFinished() {
    if (smartFanStates.isFanOn()) {
        writeCodeToArduino(LED_RING_CODE_ON);
    } else {
        writeCodeToArduino(LED_RING_CODE_OFF);
    }
}
...

The codes themselves are defined as constants at the top of the class.

private static final char LED_RING_CODE_CONVO_START = 'S';
private static final char LED_RING_CODE_OFF = 'F';
private static final char LED_RING_CODE_ON = 'O';
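
Since the two boards only ever exchange single bytes, the whole protocol fits in a small lookup table. The sketch below just documents what each code means; it doesn't run on either device:

```javascript
// One byte over UART selects one ring behavior on the Arduino side.
const ledRingProtocol = {
  S: 'conversation started: fill the ring with green, one pixel at a time',
  F: 'fan off: all pixels dark',
  O: 'fan on: all pixels low-brightness blue'
};

// Describe a received code byte; unknown codes are simply ignored.
function describeCode(code) {
  return ledRingProtocol[code] || 'unknown code: ignored';
}

console.log(describeCode('O')); // "fan on: all pixels low-brightness blue"
```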

Once your Android Things app is updated, it's time to update your Arduino program. This program will listen for a code byte from the Things board, and will respond by changing the LED ring appropriately.

#include <SoftwareSerial.h> 
#include <Adafruit_NeoPixel.h> 
#define PIN 6 
#define NUMPIXELS      24 
Adafruit_NeoPixel pixels = Adafruit_NeoPixel(NUMPIXELS, PIN, NEO_GRB + NEO_KHZ800); 
int delayval = 500; // delay for half a second 
SoftwareSerial gtSerial(8, 7); // Arduino RX, Arduino TX 
void setup() { 
 gtSerial.begin(9600);  // software serial port 
 pixels.begin(); // This initializes the NeoPixel library. 
} 
byte rx_byte = 0;        // stores received byte 
void loop() { 
 // check if a byte is available from the software serial port 
 if (gtSerial.available()) { 
   rx_byte = gtSerial.read(); 
   // write back to Android Things board for debugging 
   gtSerial.write(rx_byte); 
   switch( rx_byte ) { 
     case 'S': { 
       //Slowly progress green 
       for(int i=0;i<NUMPIXELS;i++){ 
         pixels.setPixelColor(i, pixels.Color(0,70,0));  
         pixels.show(); 
         delay(delayval); 
       } 
       break; 
     } 
     case 'F': { 
       //All LEDs off 
       for(int i=0;i<NUMPIXELS;i++){ 
         pixels.setPixelColor(i, pixels.Color(0,0,0)); 
       } 
       pixels.show(); 
       break; 
     } 
     case 'O': { 
       //All LEDs low blue 
       for(int i=0;i<NUMPIXELS;i++) { 
         pixels.setPixelColor(i, pixels.Color(0,0,40)); 
       } 
       pixels.show(); // This sends the updated pixel color to the hardware. 
       break; 
     } 
   } 
 } 
} 

When a conversation starts, the ring will slowly add more green lights; when the fan is on, the LED ring will be fully lit with a low-brightness blue; and when the fan is off without a conversation occurring, the LED ring will be off.

Conclusion

Taking a simple object and making it smart is a fun and exciting way to learn a new platform like Android Things. It also provides an excellent reason to learn new things about electronics and other APIs, as this project is the first time I have really used a relay or dug into the Google Assistant and Actions APIs. Taking something small and continuously adding to it will eventually lead to something pretty cool. Hopefully you were able to learn a good deal about Android Things, the Assistant API, the Actions API, the Awareness API and Firebase in order to create your own awesome smart devices. If you have any questions, the GitHub link for this project is posted at the bottom, and I'll try to answer questions in the comments.

Closing Note

I want to mention some of the things I would like to add to improve this project going forward, as no project is ever "done", so much as "done enough for now":

  • Use the other form of Google Actions that doesn't need API.AI and the "talk to smart fan" phrasing.
  • Make a custom case via 3D printing, add my own motors. This project works great on its own, and is easier for others to replicate without 3D printers/the time to 3D print, but building something from the ground up is always nice.
  • Wi-Fi setup support in the app via the Nearby API, so users can connect the device to Wi-Fi without having to use developer tools.
  • Try my hand at making an iOS companion app using Firebase for turning the fan on and off.

Code

Source on GitHub
Smart fan with Google Assistant and Firebase integration

Schematics

Smart fan with Arduino

