Adding a toolbar to UIImagePickerController

For an iPad app I was working on recently, I wanted to be able to show a toolbar on the bottom of a UIPopoverController which was being used to present a UIImagePickerController. It turned out this was a non-trivial problem to solve.

On iPad you are meant to display the image picker inside a popover. My first attempt was to ignore this and create a new view controller that had a main view and a toolbar. I added the image picker into the main view and presented the whole thing in a popover:

UIImagePickerController *imagePicker = [[UIImagePickerController alloc] init];
ToolbarPickerController *toolbarPicker = [[ToolbarPickerController alloc] init];
[toolbarPicker.mainView addSubview:imagePicker.view];

UIPopoverController *popover = [[UIPopoverController alloc] initWithContentViewController:toolbarPicker];

This worked but was unsatisfactory for a few reasons:

  • The popover controller normally makes UINavigationController content look like it’s part of the popover itself (it styles the navigation bar, and any toolbars, to blend perfectly into the popover border). Because my new view controller was not a UINavigationController, but hosted one instead (UIImagePickerController), the popover did not apply the correct appearance.
  • The image picker thought it was modal and displayed a cancel button. It doesn’t do this normally when displayed in a popover. It was impossible to get rid of the cancel button and it was jarring. I also got a few warnings about a missing style while debugging.

In fact, this had not been my first attempt. Initially I had planned to simply add some toolbar items to the UIImagePickerController but gave up very quickly when this didn’t work. I decided to try again…but a bit harder.

My initial attempt had been to do something like this prior to displaying the popover:

imagePicker.toolbarHidden = NO;
imagePicker.toolbarItems = [NSArray arrayWithObjects…];

This doesn’t work, and it’s flawed. Firstly, the image picker forcibly hides the toolbar as soon as it presents anything. Secondly, the image picker is a navigation controller, and therefore it sets its toolbar items from whatever view controller it is presenting. I wanted a consistent toolbar that would be visible all the time.

It turns out the solution is to show the toolbar and set the toolbar items on the top view controller every time the image picker presents a view controller. This can be achieved by implementing a couple of UINavigationControllerDelegate methods:

- (void)navigationController:(UINavigationController *)navigationController willShowViewController:(UIViewController *)viewController animated:(BOOL)animated
{
  if (navigationController == self.imagePicker)
  {
    [self.imagePicker setToolbarHidden:NO animated:NO];
    [self.imagePicker.topViewController setToolbarItems:self.toolbarItems animated:NO];
  }
}

- (void)navigationController:(UINavigationController *)navigationController didShowViewController:(UIViewController *)viewController animated:(BOOL)animated
{
  if (navigationController == self.imagePicker)
  {
    [self.imagePicker setToolbarHidden:NO animated:NO];
    [self.imagePicker.topViewController setToolbarItems:self.toolbarItems animated:NO];
  }
}

You have to put the code in both methods. If you don’t, some transitions will hide the toolbar and it might not appear when initially displayed.
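
For completeness, this is roughly how everything gets wired up before the popover is shown. It’s a sketch rather than drop-in code, assuming it lives in a bar button action method (so sender is a UIBarButtonItem); property names like self.popover and the doneTapped: selector are purely illustrative:

self.imagePicker = [[UIImagePickerController alloc] init];
self.imagePicker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
self.imagePicker.delegate = self;   // self implements the UINavigationControllerDelegate methods above

// The items we want visible on the picker's toolbar at all times
self.toolbarItems = [NSArray arrayWithObject:
    [[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemDone
                                                  target:self
                                                  action:@selector(doneTapped:)]];

self.popover = [[UIPopoverController alloc] initWithContentViewController:self.imagePicker];
[self.popover presentPopoverFromBarButtonItem:sender
                     permittedArrowDirections:UIPopoverArrowDirectionAny
                                     animated:YES];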

PJ’s Guitar Tuner

My father-in-law owns a guitar shop. He also plays in several bands. He likes ‘the quo’. He could be ‘the quo’. He spends a lot of his time helping to look after my lovely but sometimes difficult son. He never thinks he is difficult. Only lovely. He never asks for anything and never wants anything. He is impossible to buy birthday presents for.

This is a kind of backstory.

While doing some client work, I became wrapped up in some audio processing stuff, analysing frequencies and microphone input. I was telling ‘grandad’ all about this when it occurred to me I could probably get it to work out the pitch as well.

A pet project was born.

It turned out I was wrong. Well, kinda right and kinda wrong. I did eventually get the pitch stuff nailed, but not at all how I thought it would work initially. I also enlisted the help of a rather good designer friend-of-a-friend who is known for his work here. He did me a nice icon and one of the themes.

So, for fame, fortune and my father-in-law I bring you…


PJ’s Guitar Tuner!

Coming to an app store near you soon. Available on iPhone and iPod touch.

I’m releasing this app under our new “Powered by Dootrix” brand/thing. Dootrix do serious stuff for a growing number of pretty serious clients…but we are also known to do the odd side project in our ‘spare time’. This is mine. For now.

Converting 8.24 bit samples in CoreAudio on iOS

When working with CoreAudio on iOS many of the sample applications use the iPhone’s canonical audio format, which is 32 bit 8.24 fixed-point audio. This is because it is the hardware’s ‘native’ format.

You end up with a buffer of fixed point data, which is a bit of a pain to deal with.

Other libraries and source code tend to work with floating point samples between -1.0 and +1.0, or with signed 16 bit integer samples. You could force CoreAudio to give you 16 bit integer samples to start with (which means it does the conversion for you before handing you the audio buffer), or you could do the conversion yourself as and when you need to, which can be a more efficient way of doing things, depending on your needs.
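
For reference, forcing CoreAudio to hand you 16 bit integer samples up front means describing that format yourself in an AudioStreamBasicDescription, roughly like this (mono, 44.1kHz chosen purely for illustration):

AudioStreamBasicDescription format = {0};
format.mSampleRate       = 44100.0;
format.mFormatID         = kAudioFormatLinearPCM;
format.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
format.mChannelsPerFrame = 1;                                             // mono
format.mBitsPerChannel   = 16;                                            // signed 16 bit samples
format.mBytesPerFrame    = (format.mBitsPerChannel / 8) * format.mChannelsPerFrame;
format.mFramesPerPacket  = 1;
format.mBytesPerPacket   = format.mBytesPerFrame * format.mFramesPerPacket;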

In this post I want to show you how you can convert the native 8.24 fixed point sample data into 16 bit integer and/or floating point sample data…and give you an explanation of how it works. But first, I need to de-mystify some stuff to do with bits and bytes.

Bit Order != Byte Order

In Objective-C you can think of the bits of a binary number going from left to right. Just as in base 10, the most significant digit is the left-most digit:

128| 64| 32| 16| 8 | 4 | 2 | 1
-------------------------------
 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0
-------------------------------

The above binary number may represent the integer 66. We can apply bit shift operations to binary numbers, such that if I shifted all the bits right (>>) by 1 place I would have:

128| 64| 32| 16| 8 | 4 | 2 | 1
-------------------------------
 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1
-------------------------------

This might represent an integer value of 33. The left-most bit is newly introduced and has been padded with a 0.
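
In code that’s simply:

UInt8 value = 66;            // 01000010
UInt8 shifted = value >> 1;  // 00100001 == 33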

So, when you are thinking about bits, and bit shifting operations, think left to right in terms of significance. Got that? Right, now let’s move on to bytes.

The above examples dealt with a single byte (8 bits). When a multi-byte number is represented in a byte array, it can be either little endian or big endian. On Intel, and in terms of CoreAudio, little endian is used. This means the BYTE with the most significance has the highest memory address and the BYTE with the least significance has the lowest memory address (little-end-first = little-endian).

See this post for why this is important when dealing with raw sample data in CoreAudio, and this post on CodeProject for a more in-depth explanation. The most important thing to realise is that bit order and byte order significance are different beasts. Don’t get confused.

For the rest of this post, we are dealing with the representation of the binary digits from the perspective of the language, not the architecture, i.e. think in terms of bit order and not byte order.
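
If you want to see byte order for yourself, a quick check like this (on iOS or Intel, both little-endian) makes it concrete:

UInt32 value = 0x01020304;
UInt8 *bytes = (UInt8 *)&value;
// bytes[0] == 0x04   (least significant byte, lowest address)
// bytes[3] == 0x01   (most significant byte, highest address)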

Converting 8.24 bit samples to 16 bit integer samples

What does this mean? It means we are going to:

  • Preserve the sign of the sample data (+/- bit)
  • Throw away 8 bits of the 24 bit sample. We assume these bits contain extra precision that we just don’t need or are not interested in.
  • Be left with a signed 16 bit sample. A signed 16 bit integer can range from -32,768 to 32,767. This will be the resulting range of our sample.

Remember, we are thinking in terms of bit order; the most significant bit (or the ‘high order’ bit) is the left-most bit. Here is an example of a 32 bit (4 byte), 8.24 fixed point sample:

  8 bits  |         24 bit sample
----------------------------------------------
 11111111 | 01101010 | 00011101 | 11001011
----------------------------------------------

In 8.24 fixed point samples, the first 8 bits represent the sign. They are either all 0 or all 1. The next 24 bits represent the sample data. We want to preserve the sign, but chuck away 8 bits of the sample data to give us a signed 16 bit integer sample.

The trick is to shift the bits 9 places to the right. It’s a crafty move. This is what happens to our 32 bits of data if we shift them right 9 places: 9 bits fall off the end, the sign bits get shunted along, and the newly vacated high bits get padded (shown here as zeros; with a signed 32 bit sample the compiler actually pads with copies of the sign bit, but since we are about to throw those bits away it makes no difference), such that we are left with:

 new bits |sign bits |                     |   gone
----------------------------------------------------
 00000000 | 01111111 | 10110101 | 00001110 |  111001011
----------------------------------------------------
                     |     low 16 bits     |

We still have 32 bits of data, with the bits shunted along. We are only interested in the low-order 16 bits (the right-most bits), which now contain the most significant bits of the 24 bit sample data. A brilliant side effect is that the left-most bit of those 16 bits is the sign!

By casting the resulting 32 bits to a 16 bit signed integer we keep just those low-order 16 bits, which are the bits we want, and we have a signed 16 bit sample that ranges from -32,768 to 32,767. If we want this as a floating point value between -1.0 and +1.0 we can now simply divide by 32,768. Voilà.

The code is thus:

SInt16 sampleInt16 = (SInt16)(originalSample >> 9);
float sampleFloat = sampleInt16 / 32768.0;
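
Applied to a whole buffer, say inside a render callback that has been handed mono canonical 8.24 data in ioData, it might look something like this (a sketch under those assumptions, not a drop-in function):

SInt32 *samples = (SInt32 *)ioData->mBuffers[0].mData;              // 8.24 fixed point samples
UInt32 sampleCount = ioData->mBuffers[0].mDataByteSize / sizeof(SInt32);

for (UInt32 i = 0; i < sampleCount; i++)
{
    SInt16 sampleInt16 = (SInt16)(samples[i] >> 9);                 // sign + top 15 bits of the sample
    float sampleFloat = sampleInt16 / 32768.0f;                     // roughly -1.0 to +1.0
    // ...store or process sampleFloat here
}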

Simple when you know how. And why!

 

 

In Praise of ARC

It’s not all the fault of the garbage collector…but I’m growing to love ARC

When I started developing on Windows in the 1990s, software was fast even though computers were pretty slow. The word ‘Native’ was used to describe an indigenous person and Java was still just a type of coffee.

Somehow this all changed. You could call it progress I guess.

Managed languages, virtual machines, JIT and byte code took over. This seemed to go hand in hand with garbage collection. “Hooray” we all shouted. No more memory leaks.

Wrong. They just became harder to find.

There were lots of other advantages though… weren’t there? Well, maybe. .Net allowed us to write in a slew of different languages that could talk to each other without having to use shared C libraries. A managed environment protected all the applications from one another and looked after resources more intelligently. So it was all still good. Right?

Frustration. Managed

I’m writing this post to vent a bit of frustration with the promises of byte code VMs and garbage collection; I’ve fallen out of love with ‘managed’. iOS and Objective-C have shown me another way.

Android’s latest version, Jelly Bean, has put the emphasis on delivering a ‘buttery smooth’ user experience. You know, the kind of experience the iPhone has enjoyed for 5 years! Well, now the Java-based Android (running on the Dalvik VM) has achieved the same thing. 5 years on. Thanks in no small part to huge leaps in its graphics acceleration hardware and a quad core processor!

On Windows, .Net and WPF are slow, hardware-hungry beasts. If you want speed, you have to drop down to the native DirectX APIs… and until recently you could not combine these different graphics technologies very easily; Windows suffers from severe ‘air-space‘ issues.

When I started developing for iOS, I was pleasantly surprised by several things:

  • All the different APIs in the graphics stack played nice together.
  • Apps were lightning fast and beautifully smooth with low memory overhead.
  • I found the lack of garbage collection liberating.

[Garbage release]

I did not, and do not, miss the managed environment. Before ARC we had to do reference counting in Objective-C on iOS. I was used to this from my days with COM on Windows but reference counting on iOS made more sense somehow. The rules seemed clearer.

And then the compiler got clever. The compiler. Not a VM.

With the introduction of ARC we don’t have to do reference counting. The compiler analyses the code and does it all for us. In the main, it does a fantastic job. The compiler and the tools for developing on iOS manage to produce native code, make it easy to consume C and C++, make reference counting almost invisible, produce sandboxed apps that can’t crash other apps, and give me the good grace to use pointers where I see fit without having to declare my code “unsafe” (most of the time, anyway).
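
For anyone who never had to do it by hand, the difference looks roughly like this (Widget is a made-up class, purely for illustration):

// Manual reference counting (pre-ARC): every alloc needs a matching release.
Widget *widget = [[Widget alloc] init];
[widget doSomething];
[widget release];    // forget this and you have a leak

// The same code under ARC: no release. The compiler works out the object's
// lifetime and inserts the retains and releases for you at compile time.
Widget *arcWidget = [[Widget alloc] init];
[arcWidget doSomething];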

I still love the Microsoft C# language and the BCL. But as for the whole managed thing? I am happy to leave it behind.

 

Understanding AurioTouch

I have been playing around with CoreAudio on iOS of late. The trouble with media APIs is that they are necessarily complex, and CoreAudio is no exception.

While trying to figure out how to read data coming from the microphone and visually render the samples to the screen, I came across the aurioTouch example provided by Apple. It looked great… until I tried to work out what the code was doing!

There are so many aspects of the code that I struggled to make sense of, from arbitrary scaling factors to the odd bit of inline assembly, but here I will mention just one. In hindsight, it doesn’t seem so obscure now. But hindsight is a wonderful thing.

After having obtained a buffer full of PCM audio data, the following code is used to fill an array of values that is used to draw the data:

SInt8 *data_ptr = (SInt8 *)(ioData->mBuffers[0].mData);
for (int i=0; i<numFrames; i++)
{
    if ((i+drawBufferIdx) >= drawBufferLen)
    {
        cycleOscilloscopeLines();
        drawBufferIdx = -i;
    }

    drawBuffers[0][i + drawBufferIdx] = data_ptr[2];
    data_ptr += 4;
}

ioData->mBuffers[0].mData contains an array of SInt32 values. These are PCM samples in 8.24 fixed-point format. This means that, nominally, 8 bits of the 32 are used to contain the whole number part, and the remaining 24 bits are used to contain the fractional part.

I could not understand why the code was iterating through it using an SInt8 pointer and why, when the actual value was extracted, it was using data_ptr[2], i.e. using the third byte of the 32 bit (4 byte) 8.24 fixed point sample and chucking away the rest. I was so confused that I turned to Stack Overflow for help. The answer given is spot on the money… but perhaps not all that clear if you are an idiot like me.

After printing out the binary contents of each sample I finally understood.

The code is using an SInt8 pointer because, at the end of the day, it is only interested in a single byte of data in each sample. Once this byte of data has been extracted, data_ptr is advanced by 4 bytes to move it to the beginning of the next complete sample (32 bit, 8.24 fixed point format).

The reason it extracts data_ptr[2] becomes apparent when you look at the binary. What I was failing to appreciate (a school boy error on my part) was that the samples are in little-endian format. This is what a typical sample might look like in memory:

data_ptr[0]     data_ptr[1]     data_ptr[2]    data_ptr[3]
----------------------------------------------------------
 01001100    |   01000000    |   11001111    |  11111111
----------------------------------------------------------

The data is little-endian, meaning the least significant byte has the lowest memory address and, conversely, the most significant byte has the highest memory address. In CoreAudio 8.24 fixed point LPCM data, the first (most significant) 8 bits are used to indicate the sign. They are either all set to zero or all set to one. The sample code ignores this and looks at the most significant byte of the remaining 24 bits… which is data_ptr[2].

It is safe to throw the rest away as it is of little consequence to the display of the signal; throwing the rest of the data away still gives you a ‘good enough’ representation of the sample.

Later on in the sample code (not shown above), this value is divided by 128 to give a value between -1 and 1. It is divided by 128 because an SInt8 can hold a value ranging from -128 to +127.
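
Put together, the extraction and the scaling boil down to something like this (a paraphrase of what the sample does, not Apple’s exact code):

SInt8 topByte = data_ptr[2];            // most significant byte of the 24 bit sample (little-endian layout)
float normalised = topByte / 128.0f;    // roughly -1.0 to +1.0, good enough for drawing the waveform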

Like I said, this is just one of many confusing bits of code in the sample app. CoreAudio is not for the faint-hearted. If you are a novice, like me, then perhaps the aurioTouch sample is not the best place to start!

 

DTRichTextEditor Project Setup

I am using a component for rich text editing in an iPad app we did for one of our clients. We started using it while it was still in early beta and just pulled the code straight into the project.

Now it’s a bit more mature, I figured it was time to update to the latest and greatest. Having taken a look at the Cocoanetics video to figure out how it should be included as a project reference, instead of just a dump of the source code, I thought I’d write down what I did. It may save someone a bit of time if you don’t want to spend an hour watching the video.

  1. Grab the latest source from the DTRichTextEditor repository. I then exported a copy to my preferred location in my own source tree. This gives me a clean copy of the code without all the .svn folders lurking around. In my case this was something like ../ThirdParty/DTRichTextEditor
  2. Open up your project in Xcode and create a Dependencies folder. Right click it and choose Add Files…
  3. Browse to DTRichTextEditor.xcodeproj and click add. Make sure the ‘copy item into destination folder’ option is not checked.
  4. Do the same again but this time browse to DTLoupe.xcodeproj and add this. It can be found in DTRichTextEditor/Core/Externals/DTLoupeView
  5. Click on your build target and go to the build phases tab.
  6. Add the DTRichTextEditor static library and DTLoupe resource bundle to the target dependencies section.
  7. In the link binaries with libraries section, add libDTRichTextEditor.a, libxml2.dylib and CoreText.framework. There may be some others that you need here as well. This project has been kicking around a while so I can’t remember what I had to add initially!
  8. Expand DTLoupe.xcodeproj in the Dependencies folder (or wherever you put it) and drill down into the products folder. You should see a DTLoupe.bundle item. Drag and drop that into the copy bundle resources section.
  9. Nearly there! Now open the build settings tab and find the search paths section. Double click the header search paths item and add in the path to the source code for DTRichTextEditor. In my case this was ../../../ThirdParty/DTRichTextEditor/Core. Ensure you check the box to the left of the path to make the search recursive.
  10. Write your code, build and run. (There’s a minimal usage sketch just below.)
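
To sanity-check the setup, something like this should compile once the header search path is right. The class and header name are the ones the DTRichTextEditor source uses as far as I can tell, so double check them against the version you pulled:

#import "DTRichTextEditorView.h"

DTRichTextEditorView *editorView = [[DTRichTextEditorView alloc] initWithFrame:self.view.bounds];
editorView.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
[self.view addSubview:editorView];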

Disclaimer: I don’t know if this is the correct way to do it. It may be that you need to do things slightly differently if you are using some features that I’m not. And, as I was adding it to an existing project, there may well have been some additional setup that I have not covered here.

In any case, this guide will prove useful to me when I forget all of this in a few weeks’ time. Maybe it will help someone else as well?

 

iBooks for Kids: Inception


Having taken a look at ‘book style apps’ on the App Store and the various types of eBooks available in the iBooks store, I became interested in how you might go about developing one. Having ruled out plain old eBooks I was left with the following two choices:

  1. Develop an app which would let you completely control the user experience but not be visible in the iBooks store.
  2. Develop an iBooks 2.0 ‘eBook’ and work within the fairly rigid constraints of the authoring tool.

Well, I’ve done apps. And it may well be the better option…but I fancied trying something new.

Now, I am not the greatest artist in the world. But I do like to play around with Sketchbook Pro on my iPad from time to time. This led me to wonder…

What if this book, or at least the images for it, were created completely on the iPad? It would give it a unique kind of flavour wouldn’t it? And what if we leveraged HTML5 widgetry to create something a bit different to the non-app books that are out there at the moment?

I was mulling this over when something else was brought to my attention. An amazing-musician-and-producer friend of mine had recently recorded a nursery rhyme album… interesting. Now, I thought, that would be cool: a touchy, interactive, sing-along iBook for kids, with illustrations done all on iPad… and all self published.

So. That was that.

I’m currently working on the illustrations for the book. It’s going to be a pet project for a while. Hopefully I’ll finish it before I get distracted or swamped by other stuff.

It could be a fun and unusual journey.

iBooks for Kids: Confusion

A while ago I blogged about a silly little iPad app I wrote for my three year old son, and how it led me to wonder about iBooks.

It struck me that an app that was essentially a book that played sounds, and perhaps had a few animations, would make more sense as an iBook. Especially as we now have access to the iBooks 2.0 platform which supports rich media content and has an easy to use authoring package.

So I wondered. And I looked. And this is what I found out.

Existing iBooks for kids are boring and overpriced.

  • Most stuff already out there uses iBooks pre version 2.0. This means you get a pretty basic eBook with pictures. A very few support sound and even some animation but it is a far from engaging experience.
  • The use of a standard eBook is a bad fit. The pages in kids books are often oddly sized and require lots of space for illustrations; you don’t want simple text wrapping. As such, by default you will find that quite often the pages scroll from left to right as you flip the page, which is a pretty bad user experience. You can make both pages of a double page fold fit on the screen but you lose any sense of involvement and the book becomes letter-boxed and lost in the unused screen space.
  • Many publishers offer iBook versions of their paper counterparts. They are digital copies, plain and simple. Considering they offer such a bad experience, it’s surprising that they are actually the same price as the printed media. One book in ‘real life’ was a touchy-feely cloth book for 0-3 year olds with just 5 pages. Its iBook equivalent was still £4.99, yet its main function (texture) was obviously non-existent!

The App Store is not the right place for books

  •  If you are looking for an engaging kids book in the real world you go to a book store, or at least the book section in a toy store. In the world of iPad it makes sense that people should go to the iBooks store but not necessarily the App Store.
  • But the iBooks store is newer than the App Store and it’s not installed by default. Yet.
  • Because the App Store is just there and because apps offer limitless possibilities for the user experience, many compelling kids books can be found on there. Except they are not books. They are apps. It just isn’t…quite…right.

iBooks 2.0 still isn’t a perfect fit

  • When you start to explore iBooks 2.0 and iBooks Author one thing becomes apparent: It was designed for text books. This should come as no surprise given the marketing push…but for me at least, it did. I was surprised to find that it so rigidly organises everything into chapters and sections. It does this to such an extent that the only kids books that I found that did use iBooks 2.0 have had to work around it.
  • Expecting kids to get the ‘pinch into and out of a chapter’ gesture is unrealistic. At least for small children. As a workaround the books I saw used a single iBooks chapter with a single section and titled various pages in that section “chapter 1”, “chapter 2” etc.
  • There is no page flip animation. Sad. But probably more practical for tiny hands.
  • Landscape is where it’s at. Although iBooks Author supports both orientations, you are confined to landscape for navigation; portrait is an optional extra.
  • There are still relatively few built-in ‘widgets’. Because it is geared towards text books, if you want something a bit different you will need to write some code. Specifically, you will need to create some HTML5 widgets in Dashcode.

Conclusion

iBooks Author for iBooks 2.0 is the best there is. But it is still not as good as developing an app…and you will need to work around several issues. However, as iBooks continues to grow it seems like it should be the place to look for these kind of books that are heavy on rich media and…well, fun kid stuff.

I wonder then. If you want to create a compelling kids book that will be found by parents who are searching for such things: do you create an app or an iBook? Do you need a developer or a publisher?

I think maybe it’s time to find out.

 

iBooks for Kids: Conception

I have been using the iPad to read on more and more lately. Maybe it was because of my new found love of the Kindle app, or maybe it was because I just wanted a chance to experiment with UIPageViewController…I’m not sure, but I decided to build my 3 year old his very own app.

He loves anything that makes loud sounds, flashes on and off and generally annoys the hell out of most normal people. In particular he likes these infuriating books that have a single plastic button and play the same jingle over and over again. They are infuriating, in part because of the sound, but primarily because the button is a single moulded enclosure, such that when the battery dies you have to buy an entire new book!

So, after going through three copies of “row row row your boat”, and not wanting to add, yet again, to the growing pile of cardboard and plastic, I decided to turn all of his books (he has pretty much the whole series) into a single app.

Some scanning, a bit of coding and a few hours later I had an app that let you push the button and turn the pages, all courtesy of UIPageViewController and AVAudioPlayer. And best of all, when the battery runs out you can plug it into the mains and recharge it!
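
The sound side of it really is that small: something along these lines, where the file name and the player property are just illustrative (you need to keep a strong reference to the player or the sound stops immediately):

#import <AVFoundation/AVFoundation.h>   // at the top of the file

NSURL *jingleURL = [[NSBundle mainBundle] URLForResource:@"jingle" withExtension:@"m4a"];
NSError *error = nil;
self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:jingleURL error:&error];
if (self.audioPlayer)
{
    [self.audioPlayer prepareToPlay];
    [self.audioPlayer play];    // wired up to the big on-screen 'button'
}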

But that was just the start…

It struck me that maybe my approach had been wrong. After all, Apple had recently released the iBooks 2.0 platform with iBooks Author. It was designed specifically for rich media books. So maybe that was a better option? Maybe I could have got it all done with even less effort and produced something with even better results.

I looked into it. I became confused.

I’m going to follow this up with a post very shortly about what I found out and what it led to. But right now, I need to go and draw a tree, and a spider, and a bird… and about half a dozen other things. I’ll tell you why shortly.