Robot musicianship is an emerging field concerned with the ability of a robotic system to create, improvise, or take part in a musical composition or performance. To do this, appropriate skills such as hearing, rhythm, and technique must be developed. Human musicians develop rhythm and technique largely through hearing. The goal of this study is to develop audio-based musicianship skills in existing robots, focusing on a multi-robot guitar-playing system built from existing industrial-type robotic arms (SCORBOTs). Audio-based synchronization of the robots is proposed, using a separate microphone for each robot. The tempo is chosen by the human operator and is indicated by percussive beats. A timing system has been developed to calculate the tempo from these beats and operate the robots accordingly. Audio-based control is proposed for overcoming dynamic issues and position errors, and will be used in conjunction with mechanical control. The MATLAB Toolbox for the Intelitek Scorbot (MTIS) serves as the main interface for synchronization and for the robots' musical performance. The complete system also includes guitar-placement calibration with a vision system consisting of a camera and the MATLAB camera calibration toolbox; parameters such as guitar shape, sound-hole location, and string configuration are used to calibrate the robot musician. This presentation will describe the timing algorithms and discuss the possibilities of using audio-based control together with position control. It will also include several audio-visual demonstrations of the robot system's capabilities.
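For illustration, the following is a minimal MATLAB sketch of how a tempo in beats per minute might be estimated from percussive beats picked up by one robot's microphone. It is not the authors' MTIS-based timing system: the function name estimate_tempo, the 10 ms frame length, and the energy threshold are illustrative assumptions, standing in for a simple onset-detection approach.

function bpm = estimate_tempo(x, fs)
% ESTIMATE_TEMPO  Sketch of tempo (BPM) estimation from a mono audio signal x
% sampled at fs Hz, assuming the operator marks the beat with percussive hits.
    frameLen = round(0.01 * fs);              % 10 ms analysis frames (assumed)
    nFrames  = floor(length(x) / frameLen);
    energy   = zeros(nFrames, 1);
    for k = 1:nFrames
        seg = x((k-1)*frameLen + 1 : k*frameLen);
        energy(k) = sum(seg .^ 2);            % short-time energy per frame
    end
    thresh = mean(energy) + 2 * std(energy);  % simple adaptive threshold (assumed)
    % Rising-edge crossings of the threshold are taken as beat onsets.
    onsets = find(energy(2:end) > thresh & energy(1:end-1) <= thresh) + 1;
    if numel(onsets) < 2
        bpm = NaN;                            % not enough beats detected
        return
    end
    ioi = diff(onsets) * frameLen / fs;       % inter-onset intervals in seconds
    bpm = 60 / median(ioi);                   % median interval -> beats per minute
end

Given such an estimate, strum or pluck commands could in principle be scheduled at intervals of 60/bpm seconds; in the system described here, command scheduling and execution are handled through MTIS.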