First Month of Vidyo operation as the official collaboration service for the LHC community

This is a summary of the first month of full operation of the Vidyo service for the LHC community, since it became the sole provider on 1 January 2013.



Usage has increased dramatically since the beginning of 2013, reaching impressive numbers:

- More than 3200 meetings per month

- Peak of 750 simultaneous connections in February

- More than 5000 distinct users and more than 3000 guests per month

- More than 18000 clients installed (desktop and mobile included)

and the growth continues…



The growth also brought some issues. They are listed below, together with the actions taken to address them:


- "Failed to join meeting" message:

This happened in January and was caused by the router topology being unable to process all peak-time accesses.


Action: The network topology of routers was reconfigured to handle the increasing number of peak requests. The problem has not been seen again since the reconfiguration.

In addition, we are adding more routers at the locations where increased traffic has been logged (four more servers serving Europe and the US, located at CERN and Internet2) to prevent anything similar from happening in the future as usage increases.

In the background, we are testing Vidyo's virtual router infrastructure, which would allow a rapid, large-scale increase in access capacity should it ever be needed.


- Problems with H.323 devices (frequent dropouts)

Action: A number of patches have been put in place and the situation is now much better. A few changes remain to be made in order to eliminate the issue completely.


- Load balancing problems on the CERN IVR

Due to the specific setup of the CERN VC services, the IVR at CERN frequently has problems load balancing dial-in H.323/SIP calls. Note that this affects the CERN IVR only.

The Internet2 IVR is load balanced and should be used as a dial-in workaround for now (it is the default if the H.323 device is based in the US).


This is being addressed in 2 ways:

1) Creation of a new cluster of gateways at CERN that will handle incoming calls only (also extending support for the growing number of VoIP connection requests from partner institutes, as well as a Skype bridge)

2) Modifications to the gateway software to allow dial-in load balancing when configured with external gatekeepers (the main cause of the load-balancing issues)


- Linux Support:

The situation improved significantly with the implementation of PULSE support. Packages are now also available through the SLC extras repository. However, some issues remain.


Issues are analysed case by case, and fixes are being put in place as new versions are released (specifically for distorted audio in some setups and some segmentation faults).

In the short term, the plan is to provide generic packages that cover families of distributions.


Support lines 

There were some issues with the support lines; these have been addressed by replacing the entities that provide support.

From Tuesday 19 February, support is provided directly (and only) by CERN and Vidyo.

During daytime in Geneva, user support is provided via the Service Desk and the second line of VC support at CERN.

During the night in Geneva, a new support line is reachable at +12014788531 for urgent issues that require immediate assistance.


New features

The published roadmap is being followed and, still within Q1 2013, we should see some new features:

1) Release of a new VidyoDesktop client that supports chat (already being tried out by some beta testers in the community)

2) Release of a Skype bridge (following up on the point above about the new gateway clusters)

3) Extension of phone bridges (already present or in the pipeline are Nikhef, CESNET, DESY, INFN, CERN and a phone number in France)

4) Recording of Vidyo meetings


At the end of next quarter, let's do another "state of the union" to analyse the progress made. :)

Thanks a lot for your time,

