Bixby Developer Center

Guides

Hands-Free and Multiple Devices Design Guide

This guide expands on the Designing with Bixby Views design guide. It explains additional modes that users might experience with Bixby, as well as how best to design with these modes in mind.

Hands-Free and Hands-On Mode

There are two ways to invoke Bixby:

Diagram showing two ways to invoke Bixby: Hands-Free Mode and Hands-On Mode

  • Hands-free mode: Users say "Hi Bixby", and then ask a question or make a request. The mic turns off when the user is done speaking. When the dialog from Bixby finishes, the mic automatically turns on to listen to the user's response.
  • Hands-on mode: Users hold down the Bixby button, or tap the Bixby icon in the lower-left corner of the screen when Bixby is open, and then ask a question or make a request. To answer Bixby's response to the original request, the user must press and hold the Bixby button again.

Bixby should respond differently for these two methods:

  • In hands-free mode, Bixby operates as if the user can only communicate by voice and is not looking at the screen. Bixby's spoken dialog has to be more robust in this case, while the screen and the related Bixby Views are more minimal. You can use the $handsFree expression language variable to check whether the user is in hands-free mode, and then change the view accordingly, as shown in the sketch after this list.
  • In hands-on mode, Bixby operates as if the user can see and interact with the screen. In this case, Bixby's spoken dialog should be minimal, while the screen and Bixby Views show more robust information, with touch navigation or buttons if necessary.
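For illustration, a result view could branch on $handsFree to pare down what is rendered when the user is listening rather than looking. The sketch below is a rough outline rather than code from an actual capsule: the SpaceResort concept and the Summary/Details layout modes are placeholders to adapt to your own models and layouts.

    result-view {
      match: SpaceResort (this)

      render {
        // Illustrative branch: a pared-down layout in hands-free mode,
        // since the user is probably not looking at the screen, and a
        // fuller layout in hands-on mode.
        if ($handsFree) {
          layout-match (this) {
            mode (Summary)
          }
        } else {
          layout-match (this) {
            mode (Details)
          }
        }
      }
    }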

Video Tutorial: Determining if a User Is Hands-On or Hands-Free When Using Bixby

The following video tutorial describes the difference between hands-on versus hands-free mode, and how to determine which one a user is using.

Comparison of Hands-Free and Hands-On Modes

Let's compare hands-free mode and hands-on mode using the Space Resorts sample capsule, in which users can book hotel rooms on different planets. Consider it a case study you can follow while designing your own capsules.

Conversation Flow in Hands-Free Mode

This is the typical conversation flow a user might have with Bixby while searching for and booking a room at a specific space resort. First, the user wakes Bixby with the wake phrase and then makes the request. Bixby returns the results and recommends which one to choose. The user confirms this choice, and Bixby continues making the reservation. Bixby then asks the user to confirm the reservation. After the user gives a verbal confirmation, Bixby lets the user know that the reservation is made!

Note

In hands-free mode, the results are included in the dialog so users can hear their choices.

First, the user wakes up Bixby:

"Hi Bixby!"

Bixby is then invoked:

Bixby waking up in Hands-Free Mode

The user makes the request:

"I want to book a room at the best space resort in the galaxy on November 30, 2020 for two nights"

Bixby processes the request:

Bixby processing a request in Hands-Free Mode

Afterwards, Bixby returns multiple results.

Bixby: I found 8 resorts in The Milky Way with availability on November 30, 2020. The highest rated resort is Venus Space Spa. It gets 4.7 stars and is 162 million miles away. Want to choose this one?

On the screen: I found 8 resorts in The Milky Way with availability on November 30, 2020. Want to choose this one?

Bixby displaying results in Hands-Free Mode

The user responds to Bixby's question:

"Yes"

And Bixby responds:

OK, I have your reservation for Venus Space Spa on November 30th to December 2nd, 2020. The total price is 500 galaxy coins. Ready to book it?

On the screen is the confirmation prompt: Ready to book it?

Bixby displaying a confirmation prompt in Hands-Free Mode

The user verbally confirms:

"Yes"

Bixby then makes the reservation and informs the user:

Done! You’ll get a confirmation email in your inbox shortly.

The screen matches this dialog.

Bixby displaying a receipt in Hands-Free Mode

This is the end of this conversation flow.

Conversation Flow in Hands-On Mode

To understand the difference between hands-on mode and hands-free mode, let's walk through a hands-on mode flow:

First, the user holds down the Bixby button, which invokes Bixby. Then they can make a request:

User: "I want to book a room at the best space resort in the galaxy on November 30, 2020 for two nights." Bixby: I found 8 resorts in The Milky Way with availability on November 30, 2020.

On the screen, Bixby Views shows a Result Moment with a List View.

Bixby displaying results in Hands-On Mode

The user can tap the first card in the list.

Bixby responds by displaying the Detail View of the Result Moment, but no dialog is spoken.

Bixby showing a detail view of a result moment in Hands-On Mode

The user can then tap the "Make Reservation" conversation driver at the bottom of the result view.

Bixby asks "Ready to confirm your reservation?" while Bixby Views shows a Confirmation Moment in a prompt view.

Bixby displaying a confirmation prompt in Hands-On Mode

The user can tap the "Book" conversation driver button at the bottom of the screen and Bixby makes the reservation.

Bixby reports:

Done! You'll get a confirmation email in your inbox shortly.

The screen shows the Receipt View of the Result Moment.

This is the end of the conversation flow.

Four Types of Hands-Free Mode

There are four main list navigation modes for hands-free interaction. For more information on how to implement these modes, see the Hands-Free List Navigation Developers' Guide.

Your options are as follows:

Read One

When in hands-free mode, it's often difficult for a user to consume lots of information at once. To make this simpler, use Read One mode with the read-one key, which presents one result at a time along with its most relevant information. The user can then decide whether this is the one they want or whether they want to hear the next one. Once the user chooses an item, Bixby presents the actions they can perform on it. Although this might seem to add extra steps, it reduces the complexity for users; as a developer, it is your job to sort the results so that the most likely choices are read first.

"Find ramen restaurants near me"

Bixby says: I found some ramen restaurants nearby. The first one is Tonkotsu-ya. It gets 4.2 stars and is 1 mile away. Would you like to go here?

Interaction flow for read-one list navigation mode

  • In hands-free mode, Bixby displays each option one by one. The next option will show when the user says "Next": A single option displayed in hands-free mode using read-one list navigation
  • In hands-on mode, Bixby displays each option as a card in a vertical list. Users can scroll up to see more options: A list of options displayed in hands-on mode
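A result view that opts into Read One navigation might be structured roughly as follows. This is an illustrative outline only: the Restaurant concept is a placeholder, and the exact prompt keys available inside read-one (list summaries, selection questions, and so on) should be checked against the Hands-Free List Navigation Developers' Guide.

    result-view {
      match: Restaurant (this)

      navigation-mode {
        // Present one result at a time in hands-free mode.
        read-one {
          // Spoken introduction before the first item is read
          // (assumed key; see the Hands-Free List Navigation Developers' Guide).
          list-summary ("I found #{size(this)} ramen restaurants nearby.")
        }
      }

      render {
        list-of (this) {
          where-each (item) {
            layout-match (item) {
              mode (Summary)
            }
          }
        }
      }
    }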
Read a Few

In Read a Few mode, which you specify with the read-many key, the user makes simple decisions about information they probably already know. For example, they already know which phone number they want to call and are simply disambiguating between a set of options.

Note

We recommend that you limit results to about three or four at a time, so you don't overwhelm the user. You can specify this with the page-size child key of read-many.

"Call my wife"

Bixby says: There are 3 numbers saved for [wife’s name]. Office, mobile, and home. Which one did you want to call?

The displayed dialog reads Which number did you want to call?

Interaction flow for read-many list navigation mode, reading a few options at a time

In hands-free mode with read-many, you should display only the most relevant options in the list. For example, you might have six options from your contacts list, but Bixby selects the top three most recently contacted in your history. Use the highlights key to pick the best options to present.

A list showing only the most relevant choices from a larger set of possible options
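Inside the view, this might look like the following fragment, which would sit in the navigation-mode block of your result view; the page size shown here is illustrative.

    navigation-mode {
      read-many {
        // Read only a few options per page so the user isn't overwhelmed.
        page-size (3)
        // A highlights block could be added here to surface only the most
        // relevant items from a larger set; see the Hands-Free List
        // Navigation Developers' Guide for its exact structure.
      }
    }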

Read All

Only when the information makes sense as a holistic group should you read all the options using read-many. This should not be done when a user is trying to make a decision between items, as it forces the user to process too much information at once.

User: "Delete my alarm" Bixby: You have alarms set for 7am, 8am, 8:30am, 9am and 2pm. Which one would you like to delete? User: "7am" Bixby: I’ve deleted your 7am alarm.

Interaction flow for read-many list navigation mode, reading all options at once

Set the page-size child key of read-many to the size() of the result set you return: page-size (size(this)), where this is the set of items you're returning. If you aren't setting your own navigation mode, just set the spoken-summary key in the view file to read each item out.

In hands-free mode, since Bixby reads all the options, the view is the same as in hands-on mode:

A list showing all the possible options
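As a fragment of the view's navigation mode, this is simply the page-size (size(this)) call described above, so every result lands on a single spoken page.

    navigation-mode {
      read-many {
        // One page containing every result, so Bixby reads them all at once.
        page-size (size(this))
      }
    }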

Read None

Use Read None mode when you are sure the user already knows the possible answers from a common set of data, such as days of the week.

User: "Next week" Bixby: Which day?

A list shown using the read-none navigation mode

You specify the read-none key in the navigation mode of your view.
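A minimal fragment might look like the following; the selection question shown inside read-none is an assumption for illustration, so check the Hands-Free List Navigation Developers' Guide for the keys it actually accepts.

    navigation-mode {
      read-none {
        // Don't read the options aloud; just prompt the user to pick one.
        item-selection-question ("Which day?")
      }
    }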

When to Define Hands-Free Mode

Because Bixby is a next-generation conversational assistant platform, designing for hands-free mode is necessary. The following workflow is the best way to define the mode:

  1. Step 1: Define Business Goals
  2. Step 2: Create the Main Flows
  3. Step 3: Write Bixby's Dialog
  4. Step 4: Find the Best Components
  5. Step 5: Implement Your Views
  6. Step 6: Repeat for Success
Step 1: Define Business Goals
Step 2: Create the Main Flows
Step 3: Write Bixby's Dialog

Write the dialog for both hands-on mode and hands-free mode.

For guidance on how to write the best dialog for your capsule, see Writing Dialog in the Design Guides. For more information on how to implement dialog, see Refining Dialog in the Developers' Guides.
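At its simplest, a results dialog file pairs a match with a template. The concept name and wording below are placeholders; wording that only makes sense when spoken (such as reading out the top result) can then be varied for hands-free mode, as described earlier in this guide.

    dialog (Result) {
      match: SpaceResort (this)
      template ("I found #{size(this)} resorts with availability.")
    }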

Step 4: Find the Best Components

Choose the best component for each view. For more guidance on which components to use for each moment, you can read the Components and Layout Patterns Developers' Guide.

Step 5: Implement Your Views
  • Use Bixby Developer Studio to create your various views, dialogs, and navigation support, if necessary.
  • Test your capsule (and all of its various targets) in the Simulator.
Step 6: Repeat for Success
  • Iterate these steps for each device target to optimize the user experience.
  • Add any device-specific handling as needed.

Cross-Device Portability

Users expect to be able to use your capsule with every Bixby device they own, so you should provide a consistent yet device-appropriate experience across all of them. This also simplifies design and development, because you only have to manage one experience across all devices. When Samsung releases new devices, your capsule will work out of the box as long as you’ve added the target to your capsule.bxb file.
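Device support is declared in the targets block of capsule.bxb. The target names below are examples only; declare just the ones your capsule actually supports, alongside the other keys your capsule.bxb already contains.

    capsule {
      id (example.spaceResorts)
      version (1.0.0)

      targets {
        // One target per device-and-locale combination the capsule supports.
        target (bixby-mobile-en-US)
        target (bixby-watch-en-US)
        target (bixby-tv-en-US)
      }
    }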

Note

Bixby Views makes it easier to build for multiple devices. You can build for one device, and once you choose a component, it will map across devices that support that component. Consult the Device Support for Views Components table for specifics on which components are supported on which devices.

Tip: Build for one device, test your views in the Simulator, then see where you can tweak and customize for the other devices.

Note

Not all components are supported on all devices, such as video on watch devices. Additionally, components might display differently depending on the device.

Make sure to check the reference for each component (including how it might display on different devices, using the interactive demo). You should also test your capsule in the Simulator to see how your components display on different devices. See Settings View in the Simulator Guide for more information.

Hands-Free and Hands-On Modes with Multiple Devices

Bixby Views lays out the screen differently depending on the mode, but the components are the same for both hands-free and hands-on modes. You can use the $handsFree expression language variable to switch between views.

Here is the space resorts example:

User: "I want to book a room at the best space resort in the galaxy on November 30, 2020 for two nights"

Mobile, Family Hub (Fridge), and Watch Devices

Bixby: I found 8 resorts in The Milky Way with availability on November 30, 2020. The highest rated resort is Venus Space Spa. It gets 4.7 stars and is 162 million miles away. Want to choose this one?

A montage of the way different devices would present a list of space resorts in response to a user query

Far-field Mode for TV Devices

In Far-field Mode on the TV, the user says "Hi Bixby", but Bixby assumes the user is looking at the TV screen, so the results are not read aloud because the user can see them.

The same list of resorts in response to a user query as presented on a TV