After nearly two years of speculation, guesswork, and endless debate about “what do you think Apple is going to do?”, we found out.
Here are some dry facts for starters:
Pure 64-bit code, leveraging core technologies such as Cocoa, OpenCL, and Grand Central Dispatch to deliver an application written from the ground up. In terms of application design, Apple have focused on three core areas: content management, image quality, and the timeline experience. We will look at each of these in turn.
What added to the Apple pixie dust sprinkled all over the presentation was that it was demoed live by none other than Randy Ubillos – uber video design guru – while Peter Steinhauer, FCP Architect, did the outline intro.
What they have done is deliver a software experience that mixes the best of what so-called ‘consumer software’ does (scene detection, people detection, image analysis, audio analysis – all on ingest) with some pretty solid professional features. So yes, there are shades of iMovie present, but overall this is new, innovative and exciting.

Just a quick sidebar here: this was just a demo, and we well know what veils of coolness and tantalising hints can be conjured in such demos, especially in the presence of the Apple faithful. So while this was interesting, it is not a shipping product. We don’t know where the holes are, and we have no idea how spending 10 hours a day behind this thing is going to feel. Rumours abound. They did not talk about plug-ins. As a marketing exercise, though, it was a master class in ninja bitch-slapping your competition.
Content Ingest and Management
As referred to earlier, the ingest process is underpinned by a bunch of tasks that are carried out in the background.
Range-based keywording allows keywords to be associated with ranges and then employed in so-called tag collections. Very useful indeed.
There was no talk of tape-based ingest and no discussion of monitoring, although they did refer to Apple’s complete colour management being present from end to end. Please forgive the pictures and the quality – I was off to the side of the hall, so I didn’t manage to be in pole position – but this will, I hope, give you some idea…
We have some video that we will post when we get a connection that is faster than the legendary Vegas wireless.
In terms of image formats it was mix and match – the demo timeline was apparently full of mixed formats, all managed within the timeline itself.
The UI is a tough one. It looks very nice at first glance, but I am not sure what customisations are present, which is a key issue with editing UI management. Everyone is different: one person’s orange-audio-track joy is another’s version of hell. However, the spline controls and the overall control design looked good, and the application appeared to be very zippy and positive.
Apple have done a lot of work on the timeline. They showed the concept of the “Magnetic Timeline”, where segments and tracks dynamically allow synced segments to be moved about. If an audio segment is going to overwrite another, it pops it onto a new track – there is no concept of ‘tracks’ on the timeline itself. The Precision trimming feature allows classic rolling trim functions, and L&J cuts as the Americans call them, executed a lot more slickly than in the current version. A big move in the right direction.
Apple also showed what we currently know as containers – except that they call them Compound Clips. Timeline segments can be selected and popped into a container collection in the timeline, where it appears as a single segment.
Overall, my impression is that conceptually a lot more appears to reference DS than Media Composer. But this is just a first opinion, of course.
They also demoed a concept called Auditioning, where you use a collection of shots to ‘audition’ which element you want in the final cut – very nice.
Colour balance and correction in the timeline, effects and geometry management, and retiming – all managed from a simple UI with very elegant, simple controls, all determined by a selected range in the timeline clip segment. Sadly you are going to have to wait for the video to see how cool that actually is. Grand Central Dispatch means that rendering happens in the background, and the cross-hatching that indicates rendering disappears while you are doing other tasks.
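Grand Central Dispatch itself is an Apple C/Objective-C API, but the underlying idea – queueing render work off the main thread so the editor stays responsive while frames finish in the background – can be sketched with an ordinary thread pool. This is purely an illustrative analogue; the function and names below are made up, not anything from Apple’s API:

```python
# Illustrative sketch of GCD-style background rendering using a thread pool.
# All names here (render_segment, etc.) are hypothetical stand-ins.
from concurrent.futures import ThreadPoolExecutor

def render_segment(segment_id):
    # Stand-in for the real render work on one timeline segment.
    return f"segment {segment_id} rendered"

with ThreadPoolExecutor(max_workers=4) as pool:
    # Queue the work; the main thread is free to keep handling edits
    # while segments render concurrently in the background.
    futures = [pool.submit(render_segment, i) for i in range(3)]
    results = [f.result() for f in futures]

print(results)
```

The point of the pattern is simply that the UI thread submits work and carries on, rather than blocking until rendering completes – which would explain the cross-hatching quietly vanishing while you keep editing.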
The final word to the faithful was that this was shipping in June and would cost $300, downloadable through the App Store.
No mention was made of the other applications in the suite. There were more questions than answers I am afraid, and I will expand further over the next day or two…