Letting the smart house know your mood
We now have a video stream from the web camera available for our use. In the FrameGrabber class, there is a Func, which will be used for the analysis function. We need to create the function to assign to this Func, which will enable emotions to be recognized.
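For reference, the analysis Func on the FrameGrabber is declared along these lines (a sketch based on Microsoft's FrameGrabber sample code; the exact declaration in your version may differ slightly):

```csharp
// Sketch of the relevant member of FrameGrabber<AnalysisResultType>,
// as found in Microsoft's Cognitive Services camera samples.
// It takes a VideoFrame and asynchronously produces an analysis result.
public Func<VideoFrame, Task<AnalysisResultType>> AnalysisFunction { get; set; }
```

Because our FrameGrabber is parameterized with CameraResult, any function matching the signature `Task<CameraResult> (VideoFrame)` can be assigned to it, which is exactly the shape of the EmotionAnalysisAsync function we are about to write.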
Create a new function, EmotionAnalysisAsync, that accepts a VideoFrame as a parameter. The return type should be Task<CameraResult>, and the function should be marked as async.
The frame we get as a parameter is used to create a MemoryStream containing the current frame, in the JPEG file format. We will find a face in this image, making sure to specify that we want emotion attributes, using the following code:
private async Task<CameraResult> EmotionAnalysisAsync(VideoFrame frame)
{
    MemoryStream jpg = frame.Image.ToMemoryStream(".jpg", s_jpegParams);

    try
    {
        Face[] face = await _faceServiceClient.DetectAsync(jpg, true, false,
            new List<FaceAttributeType> { FaceAttributeType.Emotion });

        EmotionScores emotions = face.First()?.FaceAttributes?.Emotion;
A successful call will result in an object containing all the emotion scores, as shown in the following code. The scores are what we want to return:
        return new CameraResult { EmotionScores = emotions };
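The CameraResult type itself is not shown in this section. A minimal sketch of what it might look like, assuming it only needs to carry the emotion scores, is as follows:

```csharp
// Hypothetical sketch: CameraResult is defined elsewhere in the project.
// Here it simply wraps the emotion scores returned by the Face API.
public class CameraResult
{
    public EmotionScores EmotionScores { get; set; }
}
```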
Catch any exceptions that may be thrown, returning null when they occur.
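The end of EmotionAnalysisAsync could then look like the following sketch; the exception handling is described in the text but not shown, so the exact shape is an assumption:

```csharp
    }
    catch (Exception)
    {
        // If the detection call fails, we have no scores to report.
        return null;
    }
}
```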
In the Initialize function, we need to assign EmotionAnalysisAsync to the Func. We also need to add an event handler, which is called each time we have a new result.
When a new result is obtained, we grab the EmotionScores it contains, as shown in the following code. If the scores are null, we do not want to do anything else:
_frameGrabber.NewResultAvailable += OnResultAvailable;
_frameGrabber.AnalysisFunction = EmotionAnalysisAsync;

private void OnResultAvailable(object sender,
    FrameGrabber<CameraResult>.NewResultEventArgs e)
{
    var analysisResult = e.Analysis?.EmotionScores;

    if (analysisResult == null)
        return;
In the following code, we parse the emotion scores in AnalyseEmotions, which we will look at in a bit:
    string emotion = AnalyseEmotions(analysisResult);

    Application.Current.Dispatcher.Invoke(() =>
    {
        SystemResponse = $"You seem to be {emotion} today.";
    });
}
Using the result from AnalyseEmotions, we print a string to the result to indicate the current mood. This needs to be invoked on the current dispatcher, as the event is triggered from another thread.
To get the current mood in a readable format, we parse the emotion scores in AnalyseEmotions as follows:
private string AnalyseEmotions(EmotionScores analysisResult)
{
    string emotion = string.Empty;

    var sortedEmotions = analysisResult.ToRankedList();
    string currentEmotion = sortedEmotions.First().Key;
With the EmotionScores we get, we call the ToRankedList function. This returns a list of KeyValuePairs containing each emotion along with its corresponding confidence, ordered from most to least likely. We only care about the most likely one, so we select it.
With the top emotion score selected, we use a switch statement to find the correct emotion. This is returned and printed to the result, as follows:
    switch (currentEmotion)
    {
        case "Anger":
            emotion = "angry";
            break;
        case "Contempt":
            emotion = "contemptuous";
            break;
        case "Disgust":
            emotion = "disgusted";
            break;
        case "Fear":
            emotion = "scared";
            break;
        case "Happiness":
            emotion = "happy";
            break;
        case "Sadness":
            emotion = "sad";
            break;
        case "Surprise":
            emotion = "surprised";
            break;
        case "Neutral":
        default:
            emotion = "neutral";
            break;
    }

    return emotion;
}
The last piece of the puzzle is to make sure that the analysis is executed at a specified interval. In the StartCamera function, add the following line, just before calling StartProcessingCamera:
_frameGrabber.TriggerAnalysisOnInterval(TimeSpan.FromSeconds(5));
This will trigger an emotion analysis every five seconds.
When I have a smile on my face, the application now knows that I am happy and can provide further interaction accordingly. If we compile and run the example, we should get results like those shown in the following screenshots:
As my mood changes to neutral, the application detects this as well: