Showing posts with label gupnp. Show all posts

Thursday, 7 August 2014

Sonos UPnP Development - Let's play some music!

So far we have seen how to discover media devices, how to monitor what is being played on them, and how to pause a media device.

I present here a way to play something on a Sonos Zoneplayer (or, for that matter, any other media renderer).

The code here is based on the code in my post Sonos UPnP Development - Controlling Playback which has been modified so that, instead of pausing the Zoneplayers, it now tries to play a file on one of them.

Firstly, an important point to consider here is that, unlike UPnPClient5, which searched for and addressed all of the zoneplayers, this version needs to target a specific zoneplayer. We therefore need a way to specify this.

We also need to specify a file to play. Browsing your Sonos library, looking for a specific file, selecting it, and playing it is a subject in its own right and, hopefully, I'll get around to covering it eventually. I intend to keep this simple, so I'm going to play a URI from the Internet which points to an MP3 file. So our program needs a way for us to specify this file.

The code below includes these enhancements:


Rather than hardcode this information, I have made them parameters so you call the program thus:
./UPnPClient7 <zoneplayer> <uri>

The code to get these parameters is mostly at lines 128 to 135. These are held in variables declared at lines 18 and 20 along with another variable, rendererFound, which is used as a flag for error reporting.
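As a rough sketch, the parameter handling might look like the following. The variable name rendererFound comes from the post; rendererName and uri are illustrative names I've assumed for the two variables it describes:

```c
#include <stdio.h>
#include <string.h>

/* Illustrative sketch of the argument handling. rendererFound is named
 * in the post; rendererName and uri are assumed names for the other two
 * variables it mentions. */
char *rendererName = NULL;  /* target zoneplayer name (or name prefix) */
char *uri = NULL;           /* URI of the MP3 file to play */
int rendererFound = 0;      /* flag used later for error reporting */

/* Returns 1 if the two expected parameters were supplied, 0 otherwise. */
int parse_args(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "Usage: %s <zoneplayer> <uri>\n", argv[0]);
        return 0;
    }
    rendererName = argv[1];
    uri = argv[2];
    return 1;
}
```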

As in UPnPClient5, we discover all of the media renderer devices and create a list of them. The callback function device_proxy_available_cb is used for this as before.

However, in this case we do a test for the media renderer name. This is at lines 98 to 103. Note we are only matching on the first few characters, so we don't have to provide the full name of the ZP. If we are not careful, we could match multiple ZPs, in which case the program will play the file on all of them. Note this is NOT the same as linking the zones.
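The prefix match described here can be sketched with strncmp; the helper name is illustrative, not from the post:

```c
#include <string.h>

/* Sketch of the "first few characters" match described above: report a
 * match when the target string supplied on the command line is a prefix
 * of the renderer's name. The function name is illustrative. */
int name_matches(const char *renderer_name, const char *target)
{
    return strncmp(renderer_name, target, strlen(target)) == 0;
}
```

A short target such as "Off" will therefore match a zone named "Office", but could also match "Office2" if you have one.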

When a match is made, we call a new utility function play_stream which is defined at lines 38 to 86. This takes two parameters: a pointer to the renderer, and the URI.

The instruction to set the AVTransport to use this URI is at lines 68 to 72. This is a call to the AVTransport service "SetAVTransportURI", which does exactly as it says on the tin. This requires a few parameters:

  • InstanceID - Always 0
  • CurrentURI - Our URI
  • CurrentURIMetaData - This describes the URI in more detail
The CurrentURIMetaData parameter is a string containing an encoded DIDL-Lite description of the file. I have hacked this into a couple of strings, "metadata1" and "metadata2", and used strcat at lines 59 to 61 to create the final metadata string which contains our URI. This is a nasty, hacky way of doing this, but it works, and it also exposes the raw data so you can see what it is.
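The metadata1/metadata2 + strcat trick can be sketched like this. The DIDL-Lite content shown is abbreviated and illustrative, not the exact strings from the post; the point is simply that the URI gets sandwiched between two fixed XML fragments:

```c
#include <string.h>

/* Abbreviated, illustrative DIDL-Lite fragments - not the exact strings
 * from the post. The URI is sandwiched between them. */
static const char *metadata1 =
    "<DIDL-Lite xmlns=\"urn:schemas-upnp-org:metadata-1-0/DIDL-Lite/\">"
    "<item id=\"0\" parentID=\"-1\" restricted=\"1\"><res>";
static const char *metadata2 =
    "</res></item></DIDL-Lite>";

/* Builds the final metadata string into 'out' (caller supplies buffer). */
void build_metadata(char *out, size_t outlen, const char *uri)
{
    if (strlen(metadata1) + strlen(uri) + strlen(metadata2) >= outlen)
        return; /* buffer too small; real code should report this */
    strcpy(out, metadata1);
    strcat(out, uri);
    strcat(out, metadata2);
}
```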

We then set this as the CurrentURI for our AVTransport. If this is successful, we then instruct the renderer to play (lines 74 to 77).
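As a hedged sketch, the two action calls might look like the following, assuming av_transport is the GUPnPServiceProxy for the renderer's AVTransport service, and uri and metadata hold the values built earlier (the argument names come from the AVTransport specification):

```c
/* Sketch: set the URI, then play it. 'av_transport', 'uri' and
 * 'metadata' are assumed to have been set up earlier. */
GError *error = NULL;

gupnp_service_proxy_send_action (av_transport, "SetAVTransportURI", &error,
                                 /* IN arguments */
                                 "InstanceID", G_TYPE_UINT, 0,
                                 "CurrentURI", G_TYPE_STRING, uri,
                                 "CurrentURIMetaData", G_TYPE_STRING, metadata,
                                 NULL,
                                 /* no OUT arguments */
                                 NULL);

if (error == NULL) {
    /* The URI was accepted, so tell the renderer to start playing */
    gupnp_service_proxy_send_action (av_transport, "Play", &error,
                                     "InstanceID", G_TYPE_UINT, 0,
                                     "Speed", G_TYPE_STRING, "1",
                                     NULL,
                                     NULL);
}
```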

Job done!

If I call this as follows:
./UPnPClient7 Office http://www.archive.org/download/jj2008-06-14.mk4/jj2008-06-14d1t03_64kb.mp3

I will have Jack Johnson start to play in my office.


Note that this is actually not added to the Sonos queue. It is played directly. Adding stuff to the Sonos Queue is something which involves some Sonos-specific capability. At the moment I'm concentrating on standard UPnP capability. I hope to cover the Sonos queue in a future article.

Sonos UPnP Development - Getting more information from the renderer

This next version is an evolution of the version in my previous post Sonos UPnP Development - Accessing the AV Transport service.  It has been extended to display some basic metadata for any tracks that are found.

The main changes from the previous version (UPnPClient4) are:

  1.  A new callback function on_didl_object_available (lines 31-44) which prints out metadata for a track object including the full URL of the album art.
  2. New code in the existing on_last_change callback which creates a parser for the XML track object (lines 86-96)
  3. A change to the service subscription to indicate we want to get the "CurrentTrackMetaData" information (line 69).
  4. I have also added the ability to specify the timeout on the command line (lines 155-158 & line 194). For brevity I haven't done any serious condition/error checking here.
If you run this program with no parameter it will default to run forever. You will need to force close it with CTRL-C. If you specify a parameter in seconds, it will run for that many seconds before exiting.
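The optional timeout parameter can be sketched as below; the function name is illustrative, and as in the post there is no serious error checking:

```c
#include <stdlib.h>

/* Sketch of the optional run-time parameter: 0 means "run forever".
 * As in the post, there is no serious condition/error checking here. */
int parse_timeout(int argc, char **argv)
{
    if (argc > 1)
        return atoi(argv[1]);  /* run time in seconds */
    return 0;                  /* no parameter: run until CTRL-C */
}
```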

The way this works is similar to UPnPClient4, on which it is based. However, within the on_last_change callback handler (which, as you recall, is called whenever the AVTransport service changes state) we request and process the track metadata.

The track metadata is contained in an XML document which looks like this:

If you uncomment the "g_print" on line 88 it will print out this XML document for you to see (it prints on a single line - I have formatted the above sample to make it easier to read).

Looking through this document you can see it contains quite a lot of information about the currently playing track. This is contained in an XML schema known as "DIDL-Lite", which is a cut-down version of DIDL. In UPnP AV this is used for a lot of things, including content directories.

Luckily, in our case the gupnp toolkit provides us with some nice capabilities to make it easy to parse this and extract information from it. The way this works is that you need to create a parser (line 58). When the parser is run it scans the DIDL document and signals every time it gets a new sub-object. In our case, the sub-object is a DIDL Item containing the track metadata. We need to create a callback function to handle this (which we have called on_didl_object_available) and connect it to the parser before we execute it (line 84). We then run the parser and let it do its thing.

In the parser callback handler on_didl_object_available (lines 30-38) we receive the metadata object from the parser as a GUPnPDIDLLiteObject. We can then use the gupnp convenience functions to extract specific data from this.
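A sketch of such a callback, using a few of the gupnp-av convenience accessors (exactly which accessors exist depends on your gupnp-av version; the ones below are examples, not necessarily the set used in the post):

```c
/* Sketch of the DIDL-Lite parser callback. The accessors shown are
 * examples of the gupnp-av convenience functions. */
static void on_didl_object_available (GUPnPDIDLLiteParser *parser,
                                      GUPnPDIDLLiteObject *object,
                                      gpointer             user_data)
{
    g_print ("Title:     %s\n", gupnp_didl_lite_object_get_title (object));
    g_print ("Album:     %s\n", gupnp_didl_lite_object_get_album (object));
    g_print ("Album Art: %s\n", gupnp_didl_lite_object_get_album_art (object));
}
```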

If you run this, you should get something similar to:

If you play tracks whilst the program is running, or if the track changes, the new details will print out.

Of particular note is the "Album Art:" information. The metadata returns a partial URL of the album art. This URL is relative to the device URL. To get the full URL we have to prefix it with the device URL (http://<ZP_IP_ADDRESS>:1400). This version of the code does this for us, so that the URL output (highlighted above) can be cut and pasted into the address bar of a web browser in order to view the album art. If you are listening to Internet radio you should get the station logo, if there is one.

The change in this updated version is achieved by pushing a pointer to the renderer object into the on_didl_object_available callback so that the callback code can reference it (the change is made at line 89). Then, in on_didl_object_available, we dereference the renderer object, use it to look up the base URL (line 39), and construct a new URL from the device base URL and the album art partial URL (line 40).
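The URL construction itself is simple concatenation, which can be sketched as below. In the real code the base URL would be derived from the renderer object; here it is passed in for illustration:

```c
#include <stdio.h>

/* Sketch: build the full album-art URL by prefixing the partial URL
 * from the metadata with the device base URL (http://<ZP_IP>:1400).
 * Real code would derive the base from the renderer object. */
void full_album_art_url(char *out, size_t outlen,
                        const char *base_url, const char *partial)
{
    snprintf(out, outlen, "%s%s", base_url, partial);
}
```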

Clearly this basic capability could be extended to a number of uses, such as:

  • Keep a GUI client up to date with current playing information
  • Create a logging service to monitor what tracks had been played
  • Create a "most played" or "recently played" playlist from the log
  • Create a screensaver or media centre display based on what a specified zone was playing
  • Look up artist information, artwork, or lyrics on the Internet and display them
  • Push album art of the currently playing track to a UPnP TV or picture frame
As well as many more applications of this.

Sonos UPnP Development - Controlling Playback

In previous posts I have introduced the important concepts of UPnP Services, and that they offer Events and Actions. We have seen how it is possible to subscribe to an Event in order to be notified of when something interesting happens.

Now we are going to look at how we can use Actions to actually control something. This is a very simplistic example, but should illustrate the basic concepts.

The following code, when run, will do a "Pause All" of the renderer devices it finds on your network. This code is based mostly on the version from my previous post UPnP Discovery with Sonos, event driven:


The main changes in this version are that we no longer print anything out. The device_info method, which was used to print out device details in v3, has been replaced by a new function pause_device which pauses the device instead (lines 20-35). The key function call is on lines 32-34. This requires a little explanation.

The call takes multiple parameters: the service being addressed (in our case, the AVTransport service, to which we have previously obtained a reference), the action requested (here, "Pause"), a flag to pass back an error, and then a bunch of parameters. The parameters depend on the action being triggered, and they are described in the device description document. In the case of AVTransport they are also within the UPnP specifications at upnp.org.

The following image shows how this looks when using a development tool such as Device Spy. The left column shows the Actions supported by AVTransport. I have selected "Pause" and the right column shows me the parameter this Action requires. It needs a single parameter called "InstanceID" of type "ui4".
Using Device Spy developer tool to examine the AVTransport interface
So in our call on line 32 (and according to the GUPnP documentation for "gupnp_service_proxy_send_action") we need to specify this parameter as a "tuple" comprising the parameter name, the parameter type and the parameter value. In this case we want the first (and only) AVTransport instance; the count starts at 0. We finish with a "NULL" to indicate there are no more parameters.
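A sketch of that call, assuming av_transport holds the AVTransport service proxy obtained earlier (the tuple layout follows the gupnp_service_proxy_send_action documentation):

```c
/* Sketch of the Pause call: one IN parameter, given as a
 * (name, type, value) tuple, terminated by NULL; a second NULL
 * says there are no OUT parameters. */
GError *error = NULL;
gupnp_service_proxy_send_action (av_transport, "Pause", &error,
                                 "InstanceID", G_TYPE_UINT, 0, /* first (and only) transport */
                                 NULL,   /* end of IN parameters */
                                 NULL);  /* no OUT parameters */
if (error != NULL)
    g_error_free (error);  /* e.g. the device was already paused: ignored here */
```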

This call is a simple one with no return parameters. If we had return parameters to deal with we would need to handle this differently.

Also, the function returns an error if the action failed. In our case it will fail if the "Pause" cannot be carried out; one reason for this would be that the device is already paused. We don't care about this here, so we ignore it. A more complete version might keep track of the transport state of each player, and only send a "Pause" action to those which were not already paused.

When run, this program does not give any output. It simply pauses every discovered player on the network. Clearly this could be used to connect to a home automation system, such that when the phone rang or the doorbell was pressed, all of the zones were paused.

Sonos UPnP Development - Accessing the AV Transport service

This is an iteration of the code I produced in my post UPnP Discovery with Sonos - Event Driven. Now we have discovered the devices on the network, we should try to do something useful with them. In this post I show how to pull some interesting information from them.

More specifically, I am going to reference the AVTransport service which is part of the MediaRenderer specification. This introduces some other important concepts in UPnP: Services, Events and Actions.

Services

Functionality in UPnP devices is primarily exposed via services. Services are advertised on the network and can be discovered. The advertisement for the service includes the technical document which explains its capabilities. This document may be viewed using UPnP Device Spy, or in the case of standards based services like AV the specifications are also published at upnp.org.

Services expose two types of capability: actions and events.

Actions

If you want to perform some function on the device, you call an appropriate action. For a media player device, for example, one action might be to start playing the current track. We will look at actions in a later article.

Events

When a device does something that the rest of the world might want to know about, it can trigger an event. The events that a device may trigger are part of its device specification document, which can be viewed using a UPnP development tool like Device Spy or, in the case of AVTransport, by looking at the UPnP AV specifications at upnp.org.

Anything else on the network that is interested can register its interest in knowing about that event by subscribing to it. It may also unsubscribe if it is no longer interested. While subscribed, it will receive an event every time the target device does the thing which causes the event.

For instance, in the case of a MediaRenderer object there is a service called AVTransport which represents the status and capability of the renderer transport.

In this article we are going to look at this and use the event called "LastChange", which is triggered whenever the transport state changes.

The code below is based on the version in my last article with some modifications:






The first thing to notice is I have removed a bunch of code:

device_info has been removed, and the code to print the hash table out in main_loop_timeout has also gone. This code was really only there in the previous version so that you could see what was going on.

I have also removed the "Device added:" and "Device removed:" print statements from device_proxy_available_cb and device_proxy_unavailable_cb so none of the console messages that were in the last version are there any more.

The major additions here are as follows:

Firstly we have added a new callback function on_last_change (lines 30 to 75). This function is passed the UDN of the device as well as some data. The data is an XML document describing the transport state.

A couple of things are important here. Firstly we used the UDN (Unique Device Name) of the device as the key when we stored it in the GHashTable. This means we can use the UDN to quickly find the renderer object.

Secondly, the XML document needs to be parsed to pull useful information from it. Luckily, GUPnP provides a set of libraries (gupnp-av) to help with this.

So, this function parses the "TransportState" information from the XML user data, looks up the renderer from our GHashTable using the UDN, and prints out the device and state information.
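As a sketch, extracting the state with the gupnp-av last-change parser might look like this, assuming parser is a GUPnPLastChangeParser created in main, last_change_xml is the event data, and renderer has been looked up from the GHashTable:

```c
/* Sketch: pull TransportState out of the LastChange XML using the
 * gupnp-av last-change parser. Variable names are illustrative. */
gchar *state = NULL;
if (gupnp_last_change_parser_parse_last_change (parser, 0,
        last_change_xml, NULL,
        "TransportState", G_TYPE_STRING, &state,
        NULL)) {
    g_print ("%s is now %s\n",
             gupnp_device_info_get_friendly_name (GUPNP_DEVICE_INFO (renderer)),
             state);
    g_free (state);
}
```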

We then need to subscribe to the AVTransport service. We do this in the device_proxy_available_cb. Whenever this is called it represents a new renderer device being found. We do the following:

1. look up the AVTransport service (line 92)
2. register our interest in the "LastChange" event, and point this at our on_last_change callback function (line 97-101)
3. and turn on the subscription (line 102)

From then on, any time the renderer transport state changes, it will fire a "LastChange" event at us which will run our little function.
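The three steps above can be sketched as follows, inside device_proxy_available_cb (variable names are illustrative; the AVTransport service URN is the standard one):

```c
/* Sketch of the three subscription steps inside device_proxy_available_cb. */
GUPnPServiceProxy *av_transport = GUPNP_SERVICE_PROXY (
    gupnp_device_info_get_service (GUPNP_DEVICE_INFO (proxy),
        "urn:schemas-upnp-org:service:AVTransport:1"));     /* step 1 */

gupnp_service_proxy_add_notify (av_transport,
                                "LastChange", G_TYPE_STRING,
                                on_last_change, NULL);      /* step 2 */

gupnp_service_proxy_set_subscribed (av_transport, TRUE);    /* step 3 */
```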

A few other, minor changes are needed to support this:
  • I have #included the gupnp-av headers (line 8)
  • I have created a parser object and initialised it (lines 16 and 135)
  • I have also moved the timeout value to a #define called "RUN_TIME" (lines 12 and 162) so it is easier to change and set it to 10 seconds
If you run this you should get something like this:

If you clear the queue you will get:


As you can see this starts to offer some interesting possibilities. You could build a GUI around this which not only listed all of the zones, but also listed whether they were currently playing or paused, all in real time.

You could also use this (with the appropriate hardware and drivers) to send on/off messages to an amplifier on a specific zone via Infrared (lirc) or a 12V trigger (parallel port?)

UPnP Discovery with Sonos, event driven

Following on from my post UPnP Discovery with Sonos Players I present a revised version which doesn't add much in terms of functionality, but which is a bit tidier and has some changes to how we handle discovery.

In the last versions, we simply printed out whatever we discovered. Processing discovery on-the-fly like this will not suit a lot of real-world programs, especially if it involves something like a GUI. What we need to do is to store the results of our discovery so that we can reference them whenever we want.

This raises an important point about discovery. UPnP discovery is not a "one shot" activity. It is a continuous process. You start it, and it then runs continuously until you stop it. There is no magic flag that says "discovery has finished". Normally the start and end of discovery are dictated by the start and end of your program. In my program, discovery (and the rest of the program) is stopped by a 2 second timer, after which discovery is stopped and the program exits.

This is a very important concept to understand as, during the continuous discovery process devices can come and go from the network. Your program has to be able to accommodate devices appearing and disappearing. Of course, we are used to that with the Sonos controllers: if we add a new player to the system it magically appears in the zone menu. If you disconnect power from a Zoneplayer it (eventually) disappears.

So what we need to do in any long-lived UPnP program is use the discovery to maintain a local copy of what is currently on the network.

This next piece of code does that. It is based on the previous versions but with some significant changes. I am using a GHashTable from the Glib collections library. Once again this is common on Linux systems. In fact it's a dependency for GUPnP so it will be there if GUPnP is.

GHashTable gives us a nice key-value table with the ability to easily add and remove items.
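The GHashTable operations used in this version can be sketched as below; the device UDN is the key and the renderer proxy is the value (the helper names are illustrative):

```c
#include <glib.h>

/* Sketch of the GHashTable usage: UDN as key, renderer proxy as value.
 * g_free is registered so copied keys are released on removal. */
static GHashTable *renderers;

static void renderers_init (void)
{
    renderers = g_hash_table_new_full (g_str_hash, g_str_equal, g_free, NULL);
}

/* device-proxy-available: remember the renderer under its UDN */
static void renderer_add (const char *udn, gpointer proxy)
{
    g_hash_table_insert (renderers, g_strdup (udn), proxy);
}

/* device-proxy-unavailable: forget it again */
static void renderer_remove (const char *udn)
{
    g_hash_table_remove (renderers, udn);
}

/* look a renderer up by its UDN at any time */
static gpointer renderer_lookup (const char *udn)
{
    return g_hash_table_lookup (renderers, udn);
}
```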

Here is the code:


Firstly, I have added some new functions as follows:

device_info (lines 19-21) is a simple print utility function which, given a key value and a renderer reference, prints out some information about that renderer. This is used by main_loop_timeout (lines 26-34) which has been altered to print out the contents of the GHashTable. Remember, main_loop_timeout is called to signal the end of the program.

We already had the callback function device_proxy_available_cb (lines 40-50), but now we have added a new callback function, device_proxy_unavailable_cb. These now respectively add the discovered renderer device to, or remove it from, the GHashTable. What this means is that at any time our GHashTable should contain a current list of the available devices on the network.

(Note that a device which has its power removed will not normally announce its departure from the network, so this list is not 100% correct).

These callback functions also print data about the added or removed device to the screen, so you can see what it's doing.

To accommodate these changes, we need to make some simple changes to the main program. Firstly we need to create our empty GHashTable, which I have called renderers (line 79). Then we need to register our interest in the device-proxy-unavailable signal which occurs when a device leaves the network, and point it to use our new callback handler (lines 97-99).

I think the only other change worth mentioning is that I have made the MediaRenderer URN into a constant (line 10 and line 87) as it's better coding practice.

On running this, you should get something similar to the following:

The first block shows the devices first being discovered. Then the 2 second timeout kicks in and this prints out the current contents of the renderers list before stopping discovery. Note that before it stops discovery the system automatically tidies up for us, removing all of the discovered devices (the final block).

If you wish, change the timeout value at line 106 from 2 seconds to something longer to make this more obvious.

Also note that, in this case, a new media renderer showed up on my list "Intel AV Renderer (laptop)". This is the Intel AV Renderer which is part of their UPnP developer toolkit. I ran this on my Windows laptop to show that this is a generic UPnP AV capability that's being used.

If you set the timeout to a longer value, you can launch and close the Intel AV renderer and see it being added to, and removed from the list of devices.

As you can imagine, this could be wrapped in some sort of GUI to give a constantly updating list of the Media Renderer devices on the network, similar to the Zone list on the Sonos controller.

UPnP Discovery with Sonos players

Following on from my previous article on Discovering UPnP Devices this post describes how to be a bit more specific about what we are searching for. Specifically, I will be searching for Sonos music streaming devices.

A crude way of doing this would be to alter the device_proxy_available_cb callback function with an if/then statement that selectively printed out device information based on the values in the model or Friendly Name information, but UPnP provides us with a better approach: When you initiate a discovery, your discovery request includes a specification of what you want to discover. Previously we used the specification upnp:rootdevice which discovered every device. We can limit this by saying we are only interested in certain types of devices.

UPnP devices have a hierarchy. Typically there is a root device which can contain one or more sub-devices. These sub-devices represent specific functionality. Sonos Zoneplayers contain several sub-devices which do different things, but in this case I am interested in the part which plays music. In UPnP terms this is a UPnP AV MediaRenderer.

Each UPnP device type has a URN which identifies what type of device or sub-device it is. The standard URNs for things like UPnP AV are described in the UPnP specification documents at www.upnp.org but these are also available directly from the device itself.

The easiest way to get these is by using a developer tool like UPnP Device Spy. Note that Device Spy and similar tools are not "sniffers", as some uninformed people like to claim. They are developer tools: they discover devices and provide an easy way to query them for their capabilities and status, and even to execute actions on them.

In this case, examining the UPnP documentation, or using Device Spy and looking at a Zoneplayer, reveals the URN for the UPnP AV Media Renderer is urn:schemas-upnp-org:device:MediaRenderer:1. So if we use that in our discovery, we will only get devices which match this specification returned.

The previous code I presented discovered every UPnP device on the network. By modifying it slightly, we can make it return only media player devices. This modified version is shown below with the lines that have changed highlighted:
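For reference, the essential change is the target URN passed when creating the control point, along these lines:

```c
/* The changed line: target the MediaRenderer sub-device instead of
 * "upnp:rootdevice" when creating the control point. */
cp = gupnp_control_point_new (context,
                              "urn:schemas-upnp-org:device:MediaRenderer:1");
```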


As you can see, the change is very simple. If we now run this we will get something like the following:
A few things are noteworthy here:

  • We only have Media Renderer devices. The firewall and the Sonos WD100 dock no longer show up
  • The format of the output is slightly different. This is because we have been returned the MediaRenderer sub-device instead of the root device
  • We no longer get the IP address. This isn't a problem as you really should never need it. If you really must have it (e.g. for display info), there are ways to find it.
  • In this case we get the Zoneplayer zone name
  • In the example above there are no non-Sonos MediaRenderer devices on the network. If there were they would show up in this list
So that is how to restrict UPnP discovery to a specific type of device.

Friday, 16 March 2012

Discovering UPnP devices



Here I present to you a piece of code that does a very basic UPnP discovery of devices on the network. This is normally the starting point for any UPnP control program.

Discovery is an important part of UPnP. It allows devices to be truly "plug 'n' play". It negates the need for the user to have to configure the network settings on devices before they plug them in. This relies on the network supporting IP address allocation using DHCP, but DHCP is almost universally used in home networks as well as most corporate networks.

One potential problem with DHCP is that it is difficult to know in advance what IP addresses are going to be assigned to devices. It is possible to configure DHCP to statically map IP addresses to specific devices, but this requires a configuration step and removes the "plug 'n' play" element. Discovery solves this problem.

The code I present requires the Gnome GUPnP libraries (as well as dependent libraries such as GSSDP). These are readily available on most Linux distributions. You will need the development packages.

The code below is a very primitive control point. When run it will discover all of the UPnP devices on the local network, and then exit. This code is based on the GUPnP example control point which has been extended slightly. There are comments in the code explaining each major part:
//============================================================================
// Name    : UPnPDiscovery.cpp
// Author  : Majik
// Version : 2.0
//============================================================================
#include <libgupnp/gupnp.h>

static GMainLoop *main_loop;

/* This is our callback method to terminate the main loop
 * after the timeout has expired */
static gboolean main_loop_timeout(void *data)
{
  g_main_loop_quit (main_loop);
  return 0;
}

/* This is our callback method to handle new devices
 * which have been discovered. It simply prints the
 * device model and friendly name to the console */
static void device_proxy_available_cb(GUPnPControlPoint *cp, GUPnPDeviceProxy *proxy)
{
   GUPnPDeviceInfo* gupnp_device_info = GUPNP_DEVICE_INFO(proxy);
   g_print("Device model: %s", gupnp_device_info_get_model_name(gupnp_device_info));
   g_print("\tFriendly name: %s\n", gupnp_device_info_get_friendly_name(gupnp_device_info));
}

/* This is the main program */
int main (int argc, char **argv)
{
   GUPnPContext *context;
   GUPnPControlPoint *cp;

   /* Required initialisation */
   g_thread_init (NULL);
   g_type_init ();

   /* Create a new GUPnP Context. Here we are using the default GLib main
    * context, and connecting to the current machine's default IP on an
    * automatically generated port. */
   context = gupnp_context_new (NULL, NULL, 0, NULL);

   /* Create a Control Point targeting UPnP root devices */
   cp = gupnp_control_point_new (context, "upnp:rootdevice");

   /* The device-proxy-available signal is emitted when any devices which
    * match our target are found, so connect to it */
   g_signal_connect (cp, "device-proxy-available", G_CALLBACK (device_proxy_available_cb), NULL);

   /* Tell the Control Point to start searching */
   gssdp_resource_browser_set_active (GSSDP_RESOURCE_BROWSER (cp), TRUE);

   /* Set a timeout of 2 seconds to finish processing */
   g_timeout_add_seconds (2, main_loop_timeout, NULL);

   /* Enter the main loop. This will start the search and result in
    * callbacks to device_proxy_available_cb. */
   main_loop = g_main_loop_new (NULL, FALSE);
   g_main_loop_run (main_loop);

   /* Clean up */
   g_main_loop_unref (main_loop);
   g_object_unref (cp);
   g_object_unref (context);

   return 0;
}
This works using "callbacks", a very common technique in UPnP development as well as in other types of asynchronous programming. There are two callback functions:

  • device_proxy_available_cb - This is called whenever a new device is discovered on the network.
  • main_loop_timeout - This is called by a timer after a set period. It is used to terminate the program, which would otherwise run forever waiting for new devices to appear on the network.

A typical output when this is run is as follows:

Device model: MiniUPnPd Friendly name: WANConnectionDevice
Device model: Sonos ZonePlayer 100 Friendly name: 192.168.0.203 - Sonos ZonePlayer
Device model: Sonos ZonePlayer S5 Friendly name: 192.168.0.204 - Sonos ZonePlayer
Device model: Sonos ZonePlayer S5 Friendly name: 192.168.0.206 - Sonos ZonePlayer
Device model: Sonos WD100 Friendly name: 192.168.0.205 - Sonos Wireless Dock


As you can see, this discovers not only Sonos devices but other UPnP devices on the network. In my case it discovers my UPnP IGD device (my firewall).


Why is programming Sonos control apps so complex?


Before I launch into actual code, it's worth an explanation about the apparent complexity of the code being presented. To someone more familiar with "macro" programming and scripting than actual development, the code I am publishing may seem unnecessarily complex. The fact is this is what most real world software apps look like because most real world problems are a lot more complex than you might at first think.

In particular, here we are having to deal with asynchronous network communication. If your experience is limited to simple integration tasks, such as configuring an "all-in-one" remote control device, you've probably not come across this. There are many simple control systems commonly used in Home Automation. Often these are one-way (for example, IR remote controls) or use simple command-response (as do many RS232-controlled devices). These are relatively easy to program, but suffer from being inflexible and limited in terms of control capabilities.

One-way control systems have no way to pull back status from the devices they control. This means they can often be unreliable or confusing to users, or that they require devices to be in a known starting state. They also cannot pull back useful information for the user, such as track data or album art.

Simple two-way control systems are better, but normally only allow one controller at a time, and that controller can only do one thing at a time. A good example of this is the iPod dock interface, which can support album art transfer, but whilst the album art is being transferred over its rather slow serial interface, you cannot perform any other actions.

The other problem with many "simpler" HA protocols is that they require distinct work per device: you have to know there is a device of a specific type, with a specific protocol. In the case of RS232/485 control protocols you also have to physically install a wire to the device. Even if the device has an Ethernet interface, you have to statically map the IP address on the network and then point something at that address. This is hardly "plug and play".

UPnP is designed to support a much richer feature set than simple one/two way command protocols whilst supporting multiple simultaneous controllers which can each perform multiple concurrent actions. It's also designed to support automatic registration and announcement of new devices as well as a description of their capabilities.

This is a far richer integration and control capability than most HA installers deal with; it requires a much richer environment and comes at the expense of some added complexity at the coding level. This may be foreign to most HA installers and end-users, but it will be very familiar to IT integrators who deal with similarly complex protocols in Enterprise and Service Provider environments all the time.

I have heard HA installers ask: "why can't Sonos control be integrated into all-in-one remote control systems?". Worse still, some will claim it's because the protocols are somehow "secret" or "protected". In doing so they are showing their gross lack of understanding of technology: such a view is incompatible with a basic level of competence in HA installations. The real answer is that most commonly used HA systems are too limited to cope with control of complex UPnP AV devices. For instance, very few remote control devices could cope with the demands of browsing the Spotify music library.

Of course, that doesn't preclude some basic level of control, such as volume control. This would require some sort of bridge between the legacy control protocol and UPnP. It's perfectly possible to do this and some of the better HA installers have demonstrated capability in this area, but it does require someone who knows what they are doing to do this integration.

Thursday, 19 January 2012

Myth 2: Sonos doesn't have an open API

This myth is usually repeated in several different forms, including "Sonos has a closed API" and even "Sonos won't let you use its proprietary API".

I have heard these statements from people who claim to be "integrators" when their job title should really be "installer" (they wouldn't know what integration was if it hit them in the eye).

I have heard this from ignorant people who assume that, because there aren't dozens of third-party Sonos apps, a closed API must be the reason.

I have heard it from fans and employees of other vendors who are trying to spread FUD about a competing product.

The bottom line...
It isn't true!


The Sonos control API is based on UPnP which is one of the most open commercial home control protocols there is. All of the specifications can be downloaded for free from www.upnp.org.

Furthermore, much of the functionality is based on UPnP-AV, which is the AV specification from www.upnp.org. Again, the specification documents for this are available free of charge from www.upnp.org. Here's an example:
UPnP Device AV Architecture 1.1

With these specifications, and these specifications alone, you can perform a whole host of activities with your Sonos setup including:

  • Getting a list of devices on the network, their name, device type, IP address, etc.
  • Getting the "transport state" of a Sonos zone (whether it's playing, paused, etc.)
  • Getting track information about what any zone is playing, including cover art (if available)
  • Playing a track, or pausing it, on a specific zone or on all zones
  • Changing the volume, muting and unmuting on any zone
  • Browsing the Sonos index by Track, Artist, Album, Folder, etc.
  • Browsing Sonos playlists
  • Seeing changes in the status of any zone, such as the player changing from paused to playing
That's a pretty good range of integration capabilities, right?
On top of that, there's a range of Sonos-specific capabilities that, whilst not standard UPnP, are published via the UPnP interface. That means they are visible, and some idea of how to use them is presented. Most of these are fairly obviously named, so it doesn't take much more than a little experimentation to work out how to use them. It would be nice to have documentation but, as none exists, part of my aim here is to use this blog as unofficial documentation of some of these features, as and when I get around to addressing them.
The sort of things that are available:

  • Adding songs to the Sonos queue (note you can play tracks using standard UPnP-AV, the queue is slightly different)
  • Setting alarms
  • Grouping and ungrouping zones, and seeing how they are grouped
  • Creating Sonos playlists
  • Changing Sonos device configuration (e.g. Line In settings)
As I say, I'll try to address these eventually in this blog, but it's totally subject to my personal time, so it might be a while off.

One area which really could do with some more documentation is Music Services (more recently this includes radio, as the radio service is provided by TuneIn). The trouble is that many music services have their own API which Sonos hooks into to use them. This API is the property of the music service and, even if it wanted to, Sonos could run into contractual problems with its partners by publishing it. I'll try to illustrate some of these as I go along, but bear in mind I don't have access to all of the music services so I can't cover all of them.

These generally work in roughly the same way:

  1. Use the music service's API to log in and browse the service
  2. Use UPnP to push selected tracks to Sonos queue (as normal)
More recently Sonos have announced a standard integration API for music services. I think this is a smart move, as most music services have yet to develop their own streaming API. By providing a free, open API standard which makes it easy for services to build a streaming interface to their systems, and therefore to reach Sonos users, this should accelerate development and may lead to a more standardised streaming service interface across the industry.
In theory this should make control app development easier, as the interface to control external music service access should be more standard.

Integrating with Sonos

As an active participant in the Sonos Forums, one of the subjects I come across fairly often is the subject of integration.

I should mention that IT integration, in general, is a subject I'm pretty familiar with: one of the major roles I have performed over the years is as a System Architect overseeing the design of operational systems for large telecommunications companies. Such systems usually comprise a hotch-potch of systems operating to different standards, to different data models, driving different operational processes. There is always a mixture of legacy and new. In fact the normal driver for the work I do is the introduction of a new system, new piece of technology, new process, etc. and the need to slot that as seamlessly as possible into the existing mess.

I will also mention that, as part of this work I often have to "roll my sleeves up" and actually get down and dirty with the integration. I am a competent and experienced developer in a range of development languages, including Java/J2EE, PHP and Perl, and have developed integration adapters for a range of systems, APIs, and protocols including XML, SOAP, JMS, HTTP, CORBA, and TMF standards (TMF814, TMF854).

Boasting aside, the point is, I understand this stuff pretty well!

So, being a long time Sonos user, I get pretty interested when the subject of integrating Sonos is discussed. Part of this is because, most of the time, it's spoken about with apparent authority by people who haven't got the first clue about the subject. Many of these are people who think they understand computers because they know how to use one competently. They think that, because they wrote a spreadsheet formula or a DOS batch script once, they are qualified to talk authoritatively about professional software development. Worse still, some believe such limited experience qualifies them to dismiss the views of highly experienced software professionals.

There are also a lot of industry observers and vested interests who like to put two and two together to create something much larger than 4.

The result is: a lot of bullshit is written about the ability to integrate Sonos with other systems.

I should point out that I do not, nor ever have, worked for Sonos. Nor do I have any financial or other vested interest in the company other than being a happy customer. There are people who try to suggest otherwise: they are bullies and idiots.

What I will attempt to do in this, my personal blog, is to dispel some of these myths. I plan to do this partly by using actual code, showing that it is possible to integrate with Sonos using the UPnP API to which it largely conforms.

Note that, at least to start with, the development platform I am using is C on Linux using the GUPnP Libraries, which are readily available on a range of systems and languages, and are normally bundled with most Linux distros. It's important to realise that, despite a significant background in software development, I'm not normally a C coder. I would describe myself as adequate at best. Any code examples I post are to illustrate something as clearly as I can. Sometimes this (and my lack of skill) gets in the way of the examples being "good design". On that note I'm happy to take recommendations from others as to how to improve this code, as long as it's not at the expense of the educational value.

It's also important to realise that this is only one of a large number of permutations of hardware platform, OS, development language and library. There are literally dozens more. I can't cover all of them, but the principles I show here should be roughly applicable to other environments.

With that I will dispel a myth:

Myth 1: Sonos needs to release a SDK to support integration

This is a myth for two reasons:

  1. An "SDK" will target a specific environment, and there are too many variations to support even all of the "popular" ones. An SDK which is designed for, say, Windows using C# and .NET libraries won't be at all useful to an iPhone iOS developer, or to a Mac user. In fact, many of the systems people want to integrate with (for example Russound or Control4) are not based on conventional PCs. Such an SDK would do nothing to support integration with those environments.
  2. There are already plenty of SDKs available. Any UPnP client library is, in fact, a Sonos support library, because the Sonos control protocol is compliant with UPnP. In fact, much of it is compliant with UPnP-AV which means code you develop for many Sonos functions will also control any other UPnP-AV media device. As I said, I'm using GUPnP, but I could easily have used libupnp or Platinum UPnP or Cyberlink and that's just some of the options I have in C. If I was using a different language I would have a different choice.
So, whilst in some circumstances I can certainly see some benefit in Sonos providing some additional development support, it's a nice to have, not a "need". This myth is spread by people who are either too lazy to check facts, ignorant of how software development works, or who just like slagging Sonos off. Sometimes it's all three!

Back to the subject, some of the code on this blog is based on code originally published in the Sonos Forums under the heading "Developer's Diary" (the subject line has now been changed to "Exploring the Sonos control API"). I have decided to focus my attention on posting in this blog for several reasons. Firstly, the Sonos Forums are largely made up of "end-users" whose only interest in development is the possibility they might get some cool free apps. There's nothing wrong with that, but it does mean that there were actually very few people who were genuinely interested in writing code of their own. I actually had someone quite aggressively query why I was posting such information on the forums, and another couple who used it as an opportunity to make less than complimentary remarks.
Secondly, there wasn't enough space in a post to adequately cover one topic, so I found I was splitting articles up into multiple posts. Also the formatting options available are limited.
Finally, this is my blog, so I can post what I like. On the forums, as a volunteer moderator, I'm the target for all kinds of hate, abuse, and attempts to suppress my views. Forums are, by nature, more public. People join the Sonos forums for community support and discussion. In general they don't want to see fights and so I largely have to toe the line. As such, suggesting people are idiots (even if they plainly are) is likely to start a fight which is not conducive to a pleasant forum. On here, I can say what I like. If you don't like it, go somewhere else.

On that final note, if you're not happy about anything I post here, then here is the place to complain, not on the forums. This blog is deliberately separated from the forums and is nothing whatsoever to do with them. Attempts to use content on this blog to attack me on the forums will not be tolerated. As Wil Wheaton so beautifully put it: "Don't Be A Dick!"