# Creating Full Backups of your Android Smartphone Without Root Access

In this blog post I will explain how you can create a full backup of your Android smartphone, even if you don’t have root access.

# “Become” a Developer

The first step is to activate the developer options in your system settings. To do that, locate the build number in your device info menu, usually found under Settings -> Device Info. Once you have located it, tap the build number seven times to unlock the developer features of your phone.

# Enable USB Debugging

Once you unlocked the developer mode, there is a new menu in your settings called Developer Options. Find this menu and enable USB Debugging.

# Install the Android Debug Bridge

The next step is to install a tool called adb (Android Debug Bridge). It is part of the Android SDK (Software Development Kit). The problem is that the current version as of the time of writing (1.0.32) is unusable due to a bug (see bug report). As soon as this bug is fixed, you can download the SDK Command Line Tools from the official download page or install a current version using Homebrew.

Update March 9th, 2017: Tested again with version 1.0.36 (Revision 0e9850346394-android), and the backup works. However, an app blocked the backup process. After uninstalling the app, the full backup was successful.

After you installed a working version of adb, you can check the connection between your device and your computer. To do this, you need to locate the adb executable file in your SDK installation or your extracted folder. Open a terminal and type the following command:

/path/to/adb devices

where /path/to/adb must be replaced with the absolute path to the adb executable. Mac users can drag and drop the adb file from the Finder into the terminal in order to insert the complete path automatically. This should result in an output similar to:

List of devices attached
3204226a921e227d    device
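If you want to use this check in a script, the device list can be parsed as well. A minimal sketch; the sample output below stands in for a live adb call, so the snippet runs on its own:

```shell
# Sample `adb devices` output used instead of a live adb call
output="List of devices attached
3204226a921e227d	device"

# Count lines after the header whose state column reads "device" (= ready)
count=$(printf '%s\n' "$output" | awk 'NR > 1 && $2 == "device" { n++ } END { print n + 0 }')
echo "$count device(s) ready"
```

Devices in states like `unauthorized` or `offline` are not counted, which is exactly what you want before starting a backup.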

# Make the Backup

If your device is connected, you can perform the backup with the following command:

/path/to/adb backup -all -apk -shared

This starts a backup process including all apps (-all), APK archives (-apk) as well as the internal and external storage data (internal memory and SD cards, -shared). If you don’t need all of this data, you can omit the corresponding parameter.

After issuing the command, you will be asked to confirm the backup operation on your smartphone. You can enter a password to encrypt your backup, which I recommend. Note that the backup process can take a long time depending on how much data is persisted on your phone. The process should result in a large file called backup.ab which is stored in the current folder (if you did not navigate to another directory: in your user home folder). If you want the backup to be stored elsewhere, you can supply the parameter -f <path> specifying the target file path.

If the file is empty or contains only 41 bytes (when using no password) or 549 bytes (when using a password), then you have a buggy version installed and need to switch to a working version (see the previous sections).
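This size check can be scripted. A minimal sketch, using the 41/549-byte values from this post as a heuristic threshold; the helper name `check_backup` is made up for illustration:

```shell
# Hypothetical helper: flag backups matching the known "empty" sizes
# (41 bytes without password, 549 bytes with password)
check_backup() {
  size=$(wc -c < "$1" | tr -d ' ')
  if [ "$size" -le 549 ]; then
    echo "suspicious: $1 is only $size bytes"
  else
    echo "ok: $1 is $size bytes"
  fi
}
```

Run it as `check_backup backup.ab` after the backup finishes.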

# Restoring Backups

Backups can be restored using the command

adb restore /path/to/backup.ab

# Installing Python 3 and music21 using Homebrew and pip3

In this blog post I’m going to explain how the Python library music21 can be installed in conjunction with Python 3 and its dependencies matplotlib, numpy and scipy on Mac OS X. It can also serve as a tutorial for installing other Python libraries/modules.

# The Problem

Initially, on my system there were two parallel Python 2 and Python 3 installations. The music21 installer chose Python 2 as the default installation target. In order to use music21 in conjunction with Python 3, I tried to install it using the command

pip3 install music21

which worked fine. However, when I tried to use the plotting capabilities of music21, an error occurred due to the missing modules matplotlib, numpy and scipy. When trying to install matplotlib by issuing

pip3 install matplotlib

the following error occurred:

SystemError: Cannot compile 'Python.h'. Perhaps you need to install python-dev|python-devel.

# Installing Python 3 using Homebrew

My final solution to this problem was to set up a new Python 3 installation using Homebrew. First, install Homebrew (if you haven’t got it yet):

ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"

Python 3 is installed using the command

brew install python

Note: Previously, the formula name was python3 but was renamed to python as Python 3 is now the official default version. The old version can be installed using brew install python@2.

If you already have Python 3 installed, Homebrew will not be able to create symlinks to the python binaries since they already exist. To overwrite the existing symlinks (and thus to set the Homebrew Python as default interpreter for your system) you have to execute this command:

brew link --overwrite python

Now, symlinks to the new Python installation are created under /usr/local/bin. By the way: Python updates can now be installed by simply executing

brew upgrade python

# Adjusting the $PATH Variable

In some cases it might be required to tweak the settings of the $PATH environment variable, namely if the old Python 3 installation is still preferred by the system because of a $PATH entry with higher priority. To check if this step is necessary, type:

which python3

If the output is /usr/local/bin/python3, you can proceed to the next section. Otherwise, check the contents of your $PATH variable with this command:

echo $PATH

which might look like this:

/Library/Frameworks/Python.framework/Versions/3.4/bin:/opt/local/bin:/opt/local/sbin:/opt/local/bin:/opt/local/sbin:/opt/subversion/bin:/sw/bin:/sw/sbin:/opt/local/bin:/opt/local/sbin:/usr/local/bin:

As you may notice, the old interpreter entry /Library/Frameworks/Python.framework/Versions/3.4/bin precedes /usr/local/bin. To give priority to the new Python 3 interpreter, change the order of the paths, ensuring that /usr/local/bin precedes other python paths:

export PATH=/usr/local/bin:[more path elements here]

This command changes the $PATH settings for the current shell session only. If you want to make the path adjustments persistent, add the command to the file .bash_profile in your user home folder. It is also possible to reuse the current value of the variable:

export PATH="/usr/local/bin:$PATH"
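If the export ends up in .bash_profile, a guarded variant avoids prepending /usr/local/bin again every time a shell starts. The guard itself is my addition, not from the original post:

```shell
# Prepend /usr/local/bin only when it is not already the first entry,
# so repeated shell startups don't grow $PATH endlessly
case "$PATH" in
  /usr/local/bin:*) ;;                 # already first, leave it alone
  *) PATH="/usr/local/bin:$PATH" ;;
esac
export PATH
```

This keeps $PATH tidy even if the snippet is sourced multiple times.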

After that, our Python 3 installed with Homebrew should now be the default system interpreter. Verify this with

which pip3

which should echo /usr/local/bin/pip3, which is in turn a symlink to the Homebrew cellar (the place where Homebrew installs modules/packages).

# Installing the Dependencies

Now you should be able to install music21 and the dependencies using pip3:

pip3 install music21
pip3 install matplotlib # this will install numpy automatically
pip3 install scipy

I hope this will help you to install a clean music21 environment. Now go have fun with musical analysis and plotting 🙂

# Library Bundles for your Xtext DSL

Xtext offers powerful cross-reference mechanisms to resolve elements inside the current resource or even from imported resources. External resources can either be imported explicitly or implicitly. My goal was to provide a library for my DSL containing a number of elements which should be referenceable from “everywhere”. Consequently, the “header” files from my library must be imported implicitly on a global level.

# Creating a Library Bundle

To create a bundle for your library, use the New Project wizard in Eclipse and choose Plug-In Development -> Plug-In Project. This way, an OSGi-based bundle is created. Create a new folder for the resources to be imported globally (e.g. headers) and copy your header files (written in your DSL) into this folder.

# Registering Implicit Imports in your DSL Bundle

Global implicit import behaviour is achievable using a custom ImportUriGlobalScopeProvider. Create your class in your DSL bundle in the package <your DSL package prefix>.dsl.scoping and extend org.eclipse.xtext.scoping.impl.ImportUriGlobalScopeProvider:

import java.util.LinkedHashSet;
import org.eclipse.emf.common.util.URI;
import org.eclipse.emf.ecore.resource.Resource;
import org.eclipse.xtext.scoping.impl.ImportUriGlobalScopeProvider;

public class MyDSLImportURIGlobalScopeProvider extends ImportUriGlobalScopeProvider {

    public static final URI HEADER_URI = URI.createURI("platform:/plugin/my.dsl.library/headers/myHeader.ext");

    @Override
    protected LinkedHashSet<URI> getImportedUris(Resource resource)
    {
        LinkedHashSet<URI> importedURIs = super.getImportedUris(resource);
        importedURIs.add(HEADER_URI);
        return importedURIs;
    }
}

The method getImportedUris() is overridden and extends the set of imports retrieved from the super implementation. Note that I used a platform:/plugin URI here, which is actually resolved successfully in an Eclipse Runtime Workspace. In the itemis blog article about global implicit imports, classpath-based URIs are used. A disadvantage of classpath-based library resolving is that it does not work out-of-the-box. In fact you have to create a plug-in project in the runtime workspace and add a dependency to your library bundle in MANIFEST.MF manually. I successfully avoided this problem by using platform:/plugin URIs, which are resolved as soon as the library bundle is present in the run configuration of the runtime Eclipse instance.

Now your global scope provider has to be bound in the runtime module of your workspace. Open MyDSLRuntimeModule and add the following binding:

@Override
public Class<? extends IGlobalScopeProvider> bindIGlobalScopeProvider()
{
    return MyDSLImportURIGlobalScopeProvider.class;
}

If you start your runtime Eclipse with the library bundle now, your global implicit imports should be resolved in the editor for your DSL.

# Resolving Implicit Global Imports in Standalone Mode

If you rely on the global implicit imports in standalone mode (e.g. in unit tests executed in your development Eclipse instance), the platform:/plugin URIs cannot be resolved. This can easily be fixed by using URI mappings in a ResourceSet. The following example shows how to create a standalone parser by injecting a ResourceSet and creating a URI mapping:

public class StandaloneXtextParser {

    @Inject
    private XtextResourceSet resourceSet;

    private boolean initialized;

    public StandaloneXtextParser() {
        super();
    }

    public EObject parse(URI uri) throws IOException {
        initializeIfApplicable();
        Resource resource = resourceSet.getResource(uri, true);
        return resource.getContents().get(0);
    }

    private void initializeIfApplicable() {
        // Note: this is not the most elegant way, just for demonstration purposes
        // Just make sure that setupParser() is called once before you parse
        if (!initialized) {
            setupParser();
            initialized = true;
        }
    }

    protected void setupParser() {
        resourceSet.addLoadOption(XtextResource.OPTION_RESOLVE_ALL, Boolean.TRUE);
        registerURIMappingsForImplicitImports(resourceSet);
    }

    private static void registerURIMappingsForImplicitImports(XtextResourceSet resourceSet)
    {
        final URIConverter uriConverter = resourceSet.getURIConverter();
        final Map<URI, URI> uriMap = uriConverter.getURIMap();
        registerPlatformToFileURIMapping(MyDSLImportURIGlobalScopeProvider.HEADER_URI, uriMap);
    }

    private static void registerPlatformToFileURIMapping(URI uri, Map<URI, URI> uriMap)
    {
        final URI fileURI = createFileURIForHeaderFile(uri);
        final File file = new File(fileURI.toFileString());
        Preconditions.checkArgument(file.exists());
        uriMap.put(uri, fileURI);
    }

    private static URI createFileURIForHeaderFile(URI uri)
    {
        return URI.createFileURI(deriveFilePathFromURI(uri));
    }

    private static String deriveFilePathFromURI(URI uri)
    {
        return ".." + uri.path().substring(7);
    }
}

The trick is to add a URI mapping which resolves the platform:/plugin URI to a relative file URI. Then the library resources are resolved by means of a relative path in the workspace and the global implicit imports can also be used in unit tests. For more information on standalone parsing please read this blog article.


# Displaying Tracks of your Music Library Filtered by Bit Rate

If you prefer managing your music in the form of audio files on your computer, your collection has probably grown over the past few years and at the same time encoding standards have improved and expectations of sound quality have risen. In most cases, the contained audio files have different sound qualities regarding their bit rates. My motivation was to display a list of files in my music collection which have a bit rate equal to or higher than 256 kbit/s.

To achieve this, I looked for command line tools that display metadata for audio files including their bit rate. For Mac OS X I found the command afinfo which works out of the box:

afinfo myFile.mp3 | grep "bit rate"

The output looks similar to this:

bit rate: 320000 bits per second
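The numeric value can also be extracted directly in the shell. A sketch using a captured sample line instead of a live afinfo call, so it runs anywhere:

```shell
# Sample afinfo output line standing in for a live call
line="bit rate: 320000 bits per second"

# Extract just the number with sed's capture group
rate=$(printf '%s\n' "$line" | sed -n 's/.*bit rate: \([0-9][0-9]*\) bits per second.*/\1/p')
echo "$rate"
```

The same pattern can then be compared against a threshold like 256000.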

If you are using another operating system, you can check for the following commands and/or install them:

file <fileName> (Ubuntu)
mp3info -r a -p "%f %r\n" <fileName>
mediainfo <fileName> | grep "Bit rate"
exiftool -AudioBitrate <fileName>
mpg123 -t <fileName>
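If you run such a script on several machines, you might probe which of the listed tools is actually installed. A hypothetical helper; the function name and probing order are my own:

```shell
# Hypothetical helper: return the name of the first available metadata tool
pick_info_tool() {
  for tool in afinfo mediainfo exiftool mp3info; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "$tool"
      return 0
    fi
  done
  return 1   # none of the candidates is installed
}
```

The result could then be assigned to a variable like the `infoConsoleCommand` used in the script below.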

My goal was to develop a program that crawls through my whole audio collection, checks the bit rate for every file and outputs a list containing only files with high bit rate. I wrote a Python script which does exactly that:

'''
Created on 25.04.2015

@author: dave
'''

import sys
import os
import subprocess
import re

# console command to display properties about your media files
# afinfo works on Mac OS X, change for other operating systems
infoConsoleCommand = 'afinfo'

# regular expression to extract the bit rate from the output of the program
pattern = re.compile('(.*)bit rate: ([0-9]+) bits per second(.*)', re.DOTALL)

def filterFile(path):
    '''
    Executes the configured info program to output properties of a media file.
    Grabs the output, filters the bit rate via a regular expression and displays
    the bit rate and the file path in case the bit rate is >= 256k.
    Returns True in case the file has a high bit rate, False otherwise.
    '''
    process = subprocess.Popen([infoConsoleCommand, path], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = process.communicate()
    match = pattern.match(str(out))
    if match is not None:
        bitRateString = match.group(2)
        bitRate = int(bitRateString)
        if bitRate >= 256000:
            print("bit rate", bitRate, ":", path)
            return True
    return False

def scanFolder(rootFolder):
    '''
    Recursively crawls through the files of the given root folder
    '''
    numFiles = 0
    numFilesFiltered = 0
    for root, subFolders, files in os.walk(rootFolder):
        for file in files:
            numFiles = numFiles + 1
            path = os.path.join(root, file)
            if filterFile(path):
                numFilesFiltered = numFilesFiltered + 1
    print("Scanned {} files from which {} were filtered.".format(numFiles, numFilesFiltered))

# main program
if len(sys.argv) != 2:
    print("Usage: MP3ByBitrateFilter <rootFolder>")
    sys.exit(1)

rootFolder = sys.argv[1]
scanFolder(rootFolder)

The root folder of your music library can be given as a command line argument. The program walks through the folder recursively and executes the command line tool that displays the bit rate in a separate process. It grabs the output and extracts the bit rate from the output using a regular expression. The bit rate and the path of the file are displayed in case the bit rate is >= 256 kbit/s. A summary is also displayed showing the total number of files and the number of filtered files.

Of course you can extend the filter criteria by adjusting the script to extract other information than the bit rate from the info command.

# Including DOT Graphs as PostScript or PDF files in LaTeX documents

Graphviz is a great open source graph visualization software. In this post I’ll demonstrate how to include DOT graphs in PDF documents using LaTeX.

The most convenient way to insert graphics into PDF documents is to use vector graphics. In contrast to raster graphics (which are pixel-based) vector graphics can be scaled arbitrarily without getting “pixelated”.

From DOT, graphs can be exported as vector graphics in Portable Document Format (PDF) or Encapsulated PostScript (EPS) format. The file type you need depends on the LaTeX compiler you are using. The standard latex compiler can handle EPS files, whereas pdflatex supports PDF graphics.

# Option 1: Using latex and EPS graphics

Assuming you have persisted your graph in a dot file, say myGraph.dot, you can convert it to an Encapsulated PostScript (EPS) file using the dot tool from the command line:

dot -Tps2 myGraph.dot -o myGraph.eps

Here is a nice shell script which converts all dot files in the current directory to eps files:

#!/bin/bash
for f in *.dot; do
  basename=$(basename "$f")
  filename=${basename%.*}
  command="dot -Tps2 $f -o $filename.eps"
  echo $command
  $command
done

To use it, simply save it as an .sh file, execute chmod 755 on it and execute it in the terminal (works on Unix-based systems only).

Having done that, we try to insert that image in our PDF file:

\begin{figure}[htp]
\begin{center}
\includegraphics[width=0.9\textwidth]{myGraph}
\caption{My Caption}
\label{fig:myGraph}
\end{center}
\end{figure}

Note that the file extension is omitted. Unfortunately, the pdflatex compiler does not support EPS images. But don’t worry, there are two options for solving that.

# Option 2: Using pdflatex and PDF graphics

If you are using pdflatex, PDF graphics have to be produced instead of EPS graphics. This is done as described in the previous section, except another target file type is specified:

dot -Tpdf myGraph.dot -o myGraph.pdf

# Option 3: Using pdflatex and converting EPS to PDF on the fly

The epstopdf LaTeX package converts your EPS files to PDF files on the fly, which can then be included in your document. This is how you must configure your graphics packages in the preamble of your LaTeX document (code from this blog post):

\newif\ifpdf
\ifx\pdfoutput\undefined
\pdffalse
\else
\pdfoutput=1
\pdftrue
\fi
\ifpdf
\usepackage{graphicx}
\usepackage{epstopdf}
\DeclareGraphicsRule{.eps}{pdf}{.pdf}{epstopdf #1}
\pdfcompresslevel=9
\else
\usepackage{graphicx}
\fi

The epstopdf package will automatically create a PDF file of your figure and include it in your document. However, the following requirements must be fulfilled: The -shell-escape option must be set when invoking the compiler. The command line setting for this invocation can be set in your LaTeX IDE.

Here I will illustrate how to configure this setting using the TeXlipse IDE for Eclipse: Open Preferences -> Texlipse -> Builder Settings, select the PdfLatex program and click Edit. The following dialog appears, in which you can add the -shell-escape option to the command line:

Another requirement (which is specific to the TeXlipse environment) is the PATH variable, which has to be set. Otherwise, the shell commands called from LaTeX cannot be executed correctly. Check your system path by opening a terminal and typing:

echo $PATH

Copy the output to the clipboard, and in TeXlipse go to Preferences -> TeXlipse -> Builder Settings -> Environment. Click Add and create a new environment variable named PATH with the value of your system path, which you just copied from the terminal:

After that, the EPS-to-PDF conversion should work on the fly. If it does not work for some reason, you can manually invoke the conversion from the command line:

epstopdf myGraph.eps

You can now publish PDF documents with infinitely-scalable graphs 🙂

# Dividing Git Repositories

I’ve been working in a single git repository with many projects (subfolders) and needed to divide it by moving some folders (including their history) to a new repository, thereby removing these files (with history) from the original repository. There are a number of posts on this, however for my case none worked exactly as indicated. I rather had to combine instructions from various sources, that’s why I decided to write a summarizing blog post here.

# Short Instructions

Basically, you have to perform these steps:

1. Make a backup
2. Clone the source repository
3. Execute git subtree commands on all folders to be moved
4. Create a new repository
5. Pull the isolated folders into the new repository and move them to subdirectories
6. Restore the original repository
7. Delete all moved folders (including history) from the original repository

# Detailed Instructions

First of all: Make a backup. Seriously, these operations will rewrite your git history and can be destructive, so make sure everything is stored elsewhere.

The first step is to create a new working copy of the existing repository.

git clone git://my-server.tld/my-repo.git

Imagine the following git repository with the following folders on top level:

A
B
C
D

Let’s say we want to move A and B to another, new git repository and leave C and D in the original repository. Therefore we invoke git subtree split – a magic command that will create a branch containing only the history of a given subdirectory. The syntax is:

git subtree split -P <directory> -b <branch>

In our example, we isolate folder A on branch onlyA and folder B on branch onlyB:

git subtree split -P A -b onlyA
git subtree split -P B -b onlyB

The source repository is now prepared. The next step is to create the new target repository:

cd ..
mkdir newrepo
cd newrepo
git init

Now we pull a branch from the original repository:

git pull /path/to/original/repository onlyA

The path to the original repository must be an absolute path; at least, relative paths did not work for me. When listing the directory contents you will notice that everything was imported to the root of your repository. If you want the merged content to be in a subfolder again, you need to create it and move the files manually using git mv commands. Don’t forget to commit your moved files, and don’t forget to move hidden files too! These can be discovered using ls -la.

mkdir A
git mv file1 A
git mv file2 A
...
git commit -m "Merged A and moved to subfolder"
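To catch hidden files automatically, the entries to move can be listed with a small dry-run helper before invoking git mv. A sketch; the function name is made up, and it only prints names rather than moving anything:

```shell
# Hypothetical dry run: list every top-level entry (including dotfiles)
# that would be moved into subfolder A; feed the list to `git mv` once
# it looks right
list_entries_to_move() {
  for entry in * .[!.]*; do
    [ -e "$entry" ] || continue      # skip the literal pattern when nothing matches
    [ "$entry" = "A" ] && continue   # don't move the target folder into itself
    printf '%s\n' "$entry"
  done
}
```

Reviewing the printed list first avoids the classic mistake of leaving .gitignore and friends behind.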

Note: Apparently it is not possible to move the files in the original repository and then use the split command. When doing this, my history got lost. For me it only worked the other way round (first split, then move the files in the target repository). Feel free to comment if you can explain this behaviour.

Repeat this process for all other folders to be moved:

git pull /path/to/original/repository onlyB
mkdir B
git mv file1 B
git mv file2 B
...
git commit -m "Merged B and moved to subfolder"

After that we restore the original repository completely (either using git clone or from a backup). Then we remove all moved folders (including history) from that repository. This is achieved using git filter-branch commands:

git filter-branch -f --tree-filter 'rm -rf A' HEAD
git filter-branch -f --tree-filter 'rm -rf B' HEAD
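Note that git filter-branch only rewrites the history; the old objects stay in the repository until the reflog expires, so the folder on disk does not shrink immediately. These standard git commands (my addition to the workflow) force the cleanup:

```shell
# Expire the reflog and garbage-collect unreachable objects so the
# rewritten repository actually shrinks on disk; the guard keeps the
# snippet harmless when run outside a git repository
if git rev-parse --git-dir >/dev/null 2>&1; then
  git reflog expire --expire=now --all
  git gc --prune=now
fi
```

Run this inside the rewritten original repository after the filter-branch commands.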

That’s it! We have now moved a part of the original repository (including history) to a new repository and removed that part (including history) from the original repository. Effectively this divided the original repository into two independent repositories.


# Defining Custom Language Templates for LaTeX Listings

The listings package enables LaTeX users to include source code snippets into LaTeX documents. The source code can be formatted using predefined languages templates. To select a predefined language, use the following code (as an example, Python is used):

\usepackage{listings}
\lstset{language=Python}
\lstinputlisting{path/to/myFile.py}

However, it is also possible to define own formatting templates for custom languages, which can be useful for developers of Domain-specific Languages (DSLs), for example. This is accomplished by using the command \lstdefinelanguage. The following example defines a language named myLang with various keywords, single line comments prefixed with //, multiline comments delimited by /* and */ and Strings delimited with double quotes:

% Define Language
\lstdefinelanguage{myLang}
{
  % list of keywords
  morekeywords={
    import,
    if,
    while,
    for
  },
  sensitive=false, % keywords are not case-sensitive
  morecomment=[l]{//}, % l is for line comment
  morecomment=[s]{/*}{*/}, % s is for start and end delimiter
  morestring=[b]" % defines that strings are enclosed in double quotes
}

Subsequently you can use the \lstset command to activate your language template for all following listings. All parameters are documented inline. This results in nice listings with line numbers, borders with rounded corners and syntax highlighting for comments, keywords and strings. The colors are based on the Java highlighting style in Eclipse:

% Define Colors
\usepackage{color}
\definecolor{eclipseBlue}{RGB}{42,0.0,255}
\definecolor{eclipseGreen}{RGB}{63,127,95}
\definecolor{eclipsePurple}{RGB}{127,0,85}

% Set Language
\lstset{
  language={myLang},
  basicstyle=\small\ttfamily, % Global Code Style
  captionpos=b, % Position of the Caption (t for top, b for bottom)
  extendedchars=true, % Allows 256 instead of 128 ASCII characters
  tabsize=2, % number of spaces indented when discovering a tab
  columns=fixed, % make all characters equal width
  keepspaces=true, % does not ignore spaces to fit width, convert tabs to spaces
  showstringspaces=false, % lets spaces in strings appear as real spaces
  breaklines=true, % wrap lines if they don't fit
  frame=trbl, % draw a frame at the top, right, left and bottom of the listing
  frameround=tttt, % make the frame round at all four corners
  framesep=4pt, % quarter circle size of the round corners
  numbers=left, % show line numbers at the left
  numberstyle=\tiny\ttfamily, % style of the line numbers
  commentstyle=\color{eclipseGreen}, % style of comments
  keywordstyle=\color{eclipsePurple}, % style of keywords
  stringstyle=\color{eclipseBlue}, % style of strings
}

That’s it. Now you can include your listings in your custom language using

\lstinputlisting{path/to/myFile.ext}

Now have fun with your language and LaTeX 🙂

# How to Install Custom LaTeX Extensions with MacTeX

When writing articles for conferences, authors are often supplied with custom layout templates, classes, bibliography styles etc. for LaTeX. The most common file extensions of these files are:

• .cls – Class files to add new document classes
• .bst – BibTeX Style files to format your bibliography
• .sty – Style files containing LaTeX macros

Using those files documents are created according to the style guidelines of the conference proceedings / journals the article will be published in. When using MacTeX on Mac OS X, the proper location for custom extensions is:

~/Library/texmf/tex/latex/

where ~ is your user home folder. I particularly needed this to install the LaTeX template for Springer Lecture Notes in Computer Science (LNCS), which is available here. After copying the whole llncs folder into ~/Library/texmf/tex/latex/, I could use the class in my LaTeX environment like this:

\documentclass{llncs}

Happy publishing 🙂

# Creating Audio Unit Plugins with SuperCollider

In this post I’ll show you how to build Audio Unit Plugins programmed in SuperCollider using the AudioUnitBuilder. First of all, I want to refer to this excellent Tutorial, in which Abel Domingues explains in great detail how the whole Audio Unit architecture works and how the SuperCollider AU works. I would advise you to read this first, especially if you want detailed background knowledge. However, some things were not working for me out of the box and therefore I decided to share some problem solutions here.

My instructions were created with the following Setup:

• Mac OS X Mountain Lion (10.8.4)
• SuperCollider 3.5.1
• Xcode 4.6.3 (4H1503)

# Install Dependencies

First of all, the following software prerequisites need to be installed:

2. Download Xcode 4. This requires an Apple Developer Account. Unfortunately, the file is very large (1.7 GB). Install it and open it (make sure you don’t confuse it with your old Xcode, if you had one before)
3. In Xcode, click on the Main Application Menu and go to Open Developer Tool -> More Developer Tools. Alternatively, you can go directly to this page. Download the Command Line Tools for Xcode and install them.
4. Download the SuperCollider AU from here. It has to be extracted to ~/Library/Audio/Plug-Ins/Components.
5. Open your SuperCollider and execute Quarks.gui. Install the AudioUnitBuilder Quark (you need to have an internet connection and SVN installed for that).

# Adjust Paths in the AudioUnitBuilder Code

The current AudioUnitBuilder refers to some old paths, which have changed. In particular, the Rez executable is now in another location, as well as the Server Plugins. Both are now packaged into an .app Container. Open the AudioUnitBuilder Source Code (mark the classname and press Cmd+Y). Inside the file you see the classvar

classvar <>rez="/Developer/Tools/Rez"

which has to be changed to

classvar <>rez="/Applications/Xcode.app/Contents/Developer/Tools/Rez"

Next, look for the method copyPlugins (in the lower third of the class). The line

cmd = "grep -e _"++ugens.asSequenceableCollection.join("_ -e _")
    + "_ plugins/*.scx";

must be changed to

cmd = "grep  -e _"++ugens.asSequenceableCollection.join("_ -e _")
    + "_ SuperCollider.app/Contents/Resources/plugins/*.scx";

Don’t forget to recompile your library (Cmd+K) before you proceed.

# Open An Example AU Specification

The AudioUnitBuilder comes with an example file called fedDelay.rtf. You find it in the Quark installation directory, which is ~/Library/Application Support/SuperCollider/quarks/AudioUnitBuilder/examples. Open the file in SuperCollider.

To use the AudioUnitBuilder, you’ll need to specify the following:

• Name of your Audio Unit
• Component Type (\aumu, \aumx, \augn or \aufx, which stands for Instruments, Mixers, Generators or Effects, respectively)
• Component Subtype (A four letter code identifying your plugin)
• specs: A two-dimensional array specifying the parameters for your plugin
• func: the function implementing the actual signal processing

# Building the Audio Unit

At the end of the example file, you see the simple instructions needed to build your AU using the AudioUnitBuilder:

builder = AudioUnitBuilder.new(name, componentSubtype, func, specs, componentType);
builder.makeInstall;

Basically, the things we have specified are passed to the constructor, and then the method makeInstall does the magic. Hm, but what does it actually do? It is a good idea to look into the source code to understand it. I will try to summarize it briefly here:

1. The contents of SuperColliderAU.component are copied recursively to <yourPluginName>.component.
2. All plugins (UGens) are deleted.
3. XML property list files are created to configure the server running inside the AU and your plugin.
4. A resource file is created based on a template (which is SuperColliderAU.rsrc in the quark directory). Some placeholders in that template are replaced using the sed Unix command. The resource is then processed with the Rez tool shipped with Xcode.
5. The DSP function you specified is compiled as a SynthDef.
6. The compiled SynthDef is copied to the synthdefs folder inside the AU.
7. Plugins (UGens compiled as .scx files) needed for signal processing are copied from your SuperCollider installation to the AU.

To let the AudioUnitBuilder do all of this for you, simply select the whole file and execute it. If all goes well, you will get output like

Created ~/Library/Audio/Plug-Ins/Components/fedDelay.component
an AudioUnitBuilder

But if something goes wrong, here are some of my solutions and debugging tips:

• I got “Error running Rez” on the console, which I traced back to an include problem in the resource file. If you open the SuperColliderAU.rsrc file in a text editor, you'll see the include #include <CoreServices/CoreServices.r>, which could not be resolved. The solution is to install the Xcode Command Line Tools, as described above.
• You can debug the commands executed in the Builder by changing .systemCmd calls to .postln.systemCmd.
• You can inspect the files created during the building process by changing mv commands to cp commands. This way all files can be reviewed in the quarks/AudioUnitBuilder directory.
• Check the contents of the .component target file by browsing it in the Finder (right click -> Show Package Contents).
• My working Audio Unit is about 2 MB in size. If yours is significantly smaller, you might have forgotten to put the SuperColliderAU.component skeleton AU into your Plug-Ins/Components folder.

The working AU should contain

• Contents/MacOS/SuperColliderAU
• Resources/plugins/(some .scx files)
• Resources/pluginSpec.plist
• Resources/serverConfig.plist
• Resources/SuperColliderAU.rsrc
• Resources/synthdefs/fedDelay.scsyndef

Alright? Then let's validate our Audio Unit.

# Audio Unit Validation

Apple provides a tool called auval to validate Audio Units. It is very useful for debugging, as you can see error messages from the SuperCollider server during the validation process.

Open a terminal window and execute

auval -v aufx FEDL SCAU

This starts the validation process for the plugin identified by a triple: component type, component subtype, and manufacturer. Check the output for error messages like “SynthDef not found”. If you see this, you most likely used the wrong SuperCollider version: to be compatible with the server shipped in the AU, the AudioUnitBuilder must be executed with SuperCollider 3.5.1.

# Try Out Your Audio Unit

If validation passes successfully, you can use your AU in any AU-compatible audio environment. For example, you can download the Audio Tools for Xcode from the Apple Developer page. They contain an app called AU Lab, which you can use to test your plugin. Here is how you insert the AU as an effect:

And this is the generated UI to control the plugin:

So that's it. I hope you got your AU running and have lots of fun coding your own AUs with SuperCollider 🙂

# Standalone Parsing with Xtext

Xtext is an awesome framework for creating your own domain-specific languages (DSLs). Given a grammar, Xtext generates your data model, lexer, and parser, and even a powerful editor integrated into the Eclipse IDE, including syntax highlighting and auto-completion 🙂

In this blog post I want to summarize some of my experiences with the behaviour of Xtext using different parsing approaches. These are useful if you want to parse input with Xtext in standalone applications or in the Eclipse context in order to get a model representation of your DSL code.

# Approach 1: Injecting an IParser instance

The first approach uses the IParser interface. In a standalone application (that is, when your code is not running in an Eclipse/Equinox environment), a parser instance can be obtained from the injector returned by an instance of <MyDSL>StandaloneSetup:

public class XtextParser {

    @Inject
    private IParser parser;

    public XtextParser() {
        setupParser();
    }

    private void setupParser() {
        Injector injector = new MyDSLStandaloneSetup().createInjectorAndDoEMFRegistration();
        injector.injectMembers(this);
    }

    /**
     * Parses data provided by an input reader using Xtext and returns
     * the root node of the resulting object tree.
     * @param reader Input reader
     * @return root object node
     * @throws IOException when errors occur during the parsing process
     */
    public EObject parse(Reader reader) throws IOException {
        IParseResult result = parser.parse(reader);
        if (result.hasSyntaxErrors()) {
            throw new ParseException("Provided input contains syntax errors.");
        }
        return result.getRootASTElement();
    }
}

Using this approach, your parse result can be retrieved with only a few lines of code. However, it only works in standalone applications. If you execute this code in the Eclipse context, the following error is logged:

java.lang.IllegalStateException: Passed org.eclipse.xtext.builder.clustering.CurrentDescriptions not of type org.eclipse.xtext.resource.impl.ResourceSetBasedResourceDescriptions
at org.eclipse.xtext.resource.containers.ResourceSetBasedAllContainersStateProvider.get(ResourceSetBasedAllContainersStateProvider.java:35)

To resolve this, the injector has to be created differently, using Guice.createInjector and the Module of your language:

Injector injector = Guice.createInjector(new MyDSLRuntimeModule());

Now the parser works fine, even in Eclipse. But if you use references to other resources or import mechanisms, you will find that references to other resources cannot be resolved. That is why you need a resource to parse Xtext input properly.

# Approach 2: Using an XtextResourceSet

To parse input using resources, you inject an XtextResourceSet and create a resource inside the ResourceSet. There are two ways to specify the input:

1. an InputStream
2. a URI specifying the location of a resource

My implementation provides one method for each of these two alternatives:

public class XtextParser {

    @Inject
    private XtextResourceSet resourceSet;

    public XtextParser() {
        setupParser();
    }

    private void setupParser() {
        new org.eclipse.emf.mwe.utils.StandaloneSetup().setPlatformUri("../");
        Injector injector = Guice.createInjector(new MyDSLRuntimeModule());
        injector.injectMembers(this);
        resourceSet.addLoadOption(XtextResource.OPTION_RESOLVE_ALL, Boolean.TRUE);
    }

    /**
     * Parses an input stream and returns the resulting object tree root element.
     * @param in Input stream
     * @return Root model object
     * @throws IOException when an I/O related parser error occurs
     */
    public EObject parse(InputStream in) throws IOException {
        Resource resource = resourceSet.createResource(URI.createURI("dummy:/inmemory.ext"));
        resource.load(in, resourceSet.getLoadOptions());
        return resource.getContents().get(0);
    }

    /**
     * Parses a resource specified by a URI and returns the resulting object tree root element.
     * @param uri URI of the resource to be parsed
     * @return Root model object
     */
    public EObject parse(URI uri) {
        Resource resource = resourceSet.getResource(uri, true);
        return resource.getContents().get(0);
    }
}

In both cases, the resource set is injected using Guice. Also, the Eclipse platform path is initialized via new org.eclipse.emf.mwe.utils.StandaloneSetup().setPlatformUri("../"), and the load option RESOLVE_ALL is added to the resource set.

If an InputStream is provided, the underlying resource is a dummy resource created in the ResourceSet. Make sure that its file extension matches that of your DSL.

If a resource URI is given, the resource can be parsed directly using resourceSet.getResource(). With this approach, all references (even to imported or otherwise referenced resources) will be resolved.

# The Dependency Injection Issue

Still, there is another problem: we are using dependency injection here in a way that is not exactly elegant. A class should not need to care about how its members are injected. In a well-designed DI-based application, there is only one injection call, and all members are instantiated recursively from “outside the class”. To learn more about this issue, please read this excellent blog post by Jan Köhnlein.
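To illustrate the principle independent of Xtext and Guice, here is a minimal sketch in plain Java with made-up class names: instead of each class fetching its own injector, the whole object graph is wired once at the application entry point (the “composition root”) and dependencies are handed in from outside via the constructor.

```java
// Minimal sketch of the "one injection call" idea, without any DI framework.
// All class names are hypothetical and for illustration only.

interface Parser {
    String parse(String input);
}

class UpperCaseParser implements Parser {
    public String parse(String input) {
        return input.toUpperCase();
    }
}

// This class does not know (or care) how its dependency is created --
// it simply receives it through the constructor.
class ParsingService {
    private final Parser parser;

    ParsingService(Parser parser) {
        this.parser = parser;
    }

    String process(String input) {
        return parser.parse(input);
    }
}

public class Main {
    public static void main(String[] args) {
        // The single composition root: everything is instantiated here,
        // outside of the classes that use the dependencies.
        Parser parser = new UpperCaseParser();
        ParsingService service = new ParsingService(parser);
        System.out.println(service.process("hello")); // prints "HELLO"
    }
}
```

A DI framework like Guice automates exactly this wiring: one `Guice.createInjector(...)` call at the entry point replaces the manual construction, while the classes themselves stay free of injector lookups.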

I hope this post was useful to you. Please feel free to share your thoughts / ideas for improvements.