How to ship high value equipment: A complete guide

At CP Cases we supply specialist protective gear across a huge range of industries, and the one thing they all have in common is their reliance on highly specialised, sensitive and fragile equipment. From the medical and defence sectors to the broadcasting industry, only the most precisely designed and robustly constructed protection can ensure the consistent high performance of equipment in all settings.

However, the valuable and fragile nature of this equipment means that moving it around the world becomes a major logistical challenge. Here we explain exactly what you need to know and consider when shipping high-value equipment across the globe.

What can go wrong when shipping high-value equipment?

There are a number of different types of transit damage that can occur when high-value equipment is on the move. As soon as the product leaves your possession and is in the hands of the company shipping it, you need to know that it is sufficiently protected against any of these.

  • Shock and impact, even from small bumps, can make highly specialised tools and equipment unusable
  • Particular frequencies of vibration may damage sensitive equipment
  • Static charges can make highly calibrated instruments unusable
  • Any level of moisture, dust or dirt can damage sensitive electronics and may interfere with the circuitry that keeps equipment running effectively
  • Extremes of temperature, humidity and changes in atmospheric pressure may all affect the performance of equipment

Having tailor-made protective cases, climate control units and specialised packaging in place can ensure that your equipment is protected from all of the above, completely sealed in its own microclimate.

Understanding the right type of protective casing for your equipment

We manufacture a huge range of protective cases, some of which come in standard sizes but many of which are designed to specification to meet exact requirements. This can make all the difference in keeping your equipment protected.

  • 19-inch racks are used across a range of industries including defence and security. They allow for the safe transportation of fragile electronic equipment within checked baggage limits.
  • Rugged textile cases can provide tailored padding for unusually shaped equipment, such as surgical tools or engineering gear, while remaining lightweight
  • Flight cases offer robust, heavy-duty protection for extremely fragile equipment like musical gear, made from plywood and strengthened with aluminium and steel
  • Lightweight aluminium cases are widely used across the broadcast industry and can protect their contents against even severe impact
  • Climate control transit cases can provide all-round protection from harsh weather, dust and dirt for any electronics, with insulation and air conditioning units fitted in order to prevent overheating.

Opting for aluminium casing can be an effective way to keep costs down if you are transporting something long-distance, but this does not necessarily apply to all types of equipment, and some may require heavier casing.

Using inner foam to protect your equipment

Foam insert protection is the last line of defence for your equipment if the outside of a case is damaged. Used correctly, it completely isolates the item inside the case, protecting it from collisions with other objects in the same case and insulating it from even the smallest vibrations as well as larger impacts, such as the case being dropped. These types of outside impacts can damage sensitive equipment beyond repair, so simply placing it inside a case is not enough.

High-quality foam inserts can cope with repeated impact, are rigid enough to fully protect equipment and are soft enough to cushion it. It is also crucial that the foam can be precisely shaped to support every part of the equipment, and that it cannot cause any adverse effects on the equipment, such as discolouration.

How to plan for successful shipping of equipment

In addition to the above, there are several practical steps that you can take in order to ensure that equipment is more likely to arrive at its destination in full working order.

  • Undertake detailed and clear analysis of the environments that the equipment will pass through and the hazards it may face. How many times will it be handled?
  • Assess risks which are specific to each item, whether in transit or otherwise
  • Ensure that different types of packaging being used are compatible with one another
  • Ensure, as best you can, that all staff involved in handling the equipment at each stage of the chain are aware of how to protect it correctly
  • Work only with trusted and reputable transport providers
  • Ensure that you have the correct insurance in place

Speak to us today about protecting your equipment

As a renowned worldwide name with over 50 years of experience in the industry, we understand your equipment protection needs and are able to offer unrivalled casing wherever you are shipping it – giving you complete peace of mind. Contact our friendly team by calling 020 8568 1881 and we will be glad to discuss what would best suit your requirements.

29.97 Things You Didn’t Know About Frame Rates

As video editors, cinematographers, and content creators, we spend a remarkable amount of time adjusting our settings. From resolution to codecs to frame rates, there is a lot of history, design, and mathematical precision behind these settings, and learning more about them might help you in your next big project.

So strap on your thinking caps, it’s time to talk frame rates.

Origins on the Silver Screen

There is no single standard for video frame rates (or framerates, frames per second, fps). While the early elite film studios of Thomas Edison’s era often shot their motion pictures between 16 and 24 fps with hand-cranked cameras, the films would be played for audiences anywhere between 20 and 26 fps. Because of this inconsistency, many films of the era seem frantically sped up, often with comically fast character movement.

Frame rate: the number of frames (consecutive still images) in one second of video.

With the addition of sound synchronization and widespread film distribution all over the world, motion picture frame rates were standardized to 24 fps in the late 1920s. As we all know, the rest of video post-production is far less defined.

While there are common frame rate choices for different types of projects, it’s important to know why certain rates became standard, how this affects the way your video plays back, and how editing platforms convert between different frame rates.

Standardizing the Video Frame Rate

In the early days of broadcast television, huge increases in viewership created demand for more high-quality, standardized television programs. The earliest television sets used cathode ray tube (CRT) technology to display video feeds as a series of scan lines, creating the first standard video resolutions.

Interlaced video is a standard method where the display first updates all even-numbered lines, then updates the odd-numbered lines. Interlaced video splits a 30 fps signal into 60 half-frames (or “fields”) per second, creating images with smoother motion. In true interlaced video, each field captures the scene at a slightly different moment in time. This means a ball moving across the screen would be in a different position for each field, or two different positions per frame.

In order to synchronize—and therefore standardize—frame rates for each television set, early systems used the AC power system. The American power grid is a 60 Hz system, which created a standard broadcast television frame rate of 30 fps. In Europe and other places with 50 Hz power systems, television broadcasts use a standard frame rate of 25 fps.

With the addition of color television came new challenges. Standard black and white television signals, known as luminance signals, could be broadcast without issue. A separate chrominance signal, carrying all the information necessary to display a program’s color data on a television screen, required extra bandwidth to transmit.

Building on earlier research, industry leaders developed a system where chrominance information could be encoded into the same signal as luminance information. While this would display color on compatible devices, the chrominance information often interfered with the luminance signal, creating visible dots on non-compatible black and white televisions.

By slightly adjusting the standard television frame rate, engineers ensured that the dots no longer appeared in the same place on the screen each second, making them far less noticeable as they moved around. For this reason, the standard broadcast frame rate in the United States is approximately 29.97 fps (technically 30,000/1,001), just slightly slower than the commonly used 30 fps.

Quantifying the Moving Picture

This history is the reason we have so many standards for frame rates and video formats. Because 29.97 fps is so close to 30, many people (and some software) will conflate the two, using the integer for any frame rate close to 30 fps.

Video playback that is slightly too slow or too fast is usually imperceptible, except when synchronizing audio. If a video is two hours long and was recorded at 30 fps, it contains 216,000 static images. If those frames are played back at 29.97 fps, the video will run for two hours and 7.2 seconds. By the end, the picture and the audio will be 7.2 seconds out of sync, which would obviously be very noticeable.
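To make that arithmetic concrete, here is a minimal Python sketch using only the numbers from the example above (two hours of footage at 30 fps, played back at 29.97 fps):

```python
# Two hours recorded at 30 fps.
frames = 2 * 60 * 60 * 30            # 216,000 frames

playback_seconds = frames / 29.97    # the same frames played back at 29.97 fps
drift_seconds = playback_seconds - 2 * 60 * 60

print(f"Playback runtime: {playback_seconds:.1f} s")   # ~7207.2 s
print(f"Audio/video drift: {drift_seconds:.1f} s")     # ~7.2 s
```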

Another way of looking at it is by counting the number of frames for a certain video length. For example, a 33.333(repeating)-second video at 30 fps will have 1,000 frames, while the same video duration at 29.97 fps would only have 999 frames.

This effect is also seen in the difference between 30,000/1,001 fps and 29.97 fps, although it requires a much longer video. For a video that is 33,366.666(repeating) seconds long (over 9 hours), a 30,000/1001 fps video would contain 999,999 frames, while a 29.97 fps video would contain only 999,998 frames.

Converting Frame Rates

What if your project’s raw footage was filmed at 24 fps, but the video was to be displayed on a 30 fps television broadcast? You would need an additional 6 frames every second. As shown in the images below, more frames per second, with less time between frames, create a smoother moving image.

24 fps with a delay of 200ms.

30 fps with a delay of 160ms.

Duplicating 6 out of each 24 frames in the source video would cause a jerky motion in the final export. In the image below, the loop uses frames from the 24 fps video, but duplicates frame numbers 0, 4, 8, 12, 16, and 20. As you can see, the ball pauses slightly at these intervals.

Here you can see the jerky motion of the ball as it duplicates six frames.
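As a rough sketch of that duplication scheme (the repeated frame numbers 0, 4, 8, 12, 16 and 20 come straight from the example above), the snippet below builds one second of 30 fps output from 24 source frame numbers:

```python
# Map one second of 24 fps source frames onto 30 fps output by duplicating every 4th frame.
source_frames = list(range(24))        # frame numbers 0..23 from the 24 fps clip
duplicated = {0, 4, 8, 12, 16, 20}     # frames shown twice in the 30 fps output

output_frames = []
for n in source_frames:
    output_frames.append(n)
    if n in duplicated:
        output_frames.append(n)        # repeating the frame causes the visible pause

print(len(output_frames))              # 30 frames of output per second of source
print(output_frames)
```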

The problem is, there is no perfect way to synthesize the in-between frames. For a 30 fps video, frame number 3 would be displayed at 100ms. For a 24 fps video, however, there is no frame that represents this timecode. The closest times are frame number 2 at 83ms and frame number 3 at 125ms.
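Those timestamps follow from a single assumption: frame n at a given frame rate is first displayed n divided by the frame rate seconds into the video. A quick sketch:

```python
def display_time_ms(frame_number, fps):
    """Time at which a given frame is first displayed, in milliseconds."""
    return frame_number / fps * 1000

print(display_time_ms(3, 30))   # 100.0 ms (the moment we need a frame for)
print(display_time_ms(2, 24))   # ~83.3 ms (nearest earlier 24 fps frame)
print(display_time_ms(3, 24))   # 125.0 ms (nearest later 24 fps frame)
```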

As illustrated below, the missing frames can be approximated using adjacent ones, but it’s not always exact.

Frame 2 of the 24 fps video (displayed at 83ms).

Frame 3 of the 24 fps video (125ms).

By blending the two frames together, you can approximate what should be displayed at 100ms. When compared to the actual frame 3 from the 30 fps video below, it’s still not perfectly accurate.

The approximated display of the 24 fps video at 100ms.

The true placement of frame 3 in the 30 fps video at 100ms.
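One simple way to approximate that blend is a weighted average of the two neighbouring frames. The sketch below assumes the frames are available as 8-bit NumPy image arrays and uses a plain linear blend, which dedicated frame-interpolation tools improve on considerably:

```python
import numpy as np

def blend_frames(frame_a, frame_b, target_ms, time_a_ms, time_b_ms):
    """Linearly blend two frames according to where target_ms falls between them."""
    weight_b = (target_ms - time_a_ms) / (time_b_ms - time_a_ms)
    blended = (1 - weight_b) * frame_a.astype(np.float32) + weight_b * frame_b.astype(np.float32)
    return blended.astype(np.uint8)

# Hypothetical placeholder frames standing in for decoded frames of the 24 fps clip.
frame_2 = np.zeros((1080, 1920, 3), dtype=np.uint8)       # displayed at ~83.3 ms
frame_3 = np.full((1080, 1920, 3), 255, dtype=np.uint8)   # displayed at 125 ms

approx_100ms = blend_frames(frame_2, frame_3, 100.0, 2 / 24 * 1000, 3 / 24 * 1000)
```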

Now that you understand the standard frame rates, how they came to be, and how seemingly small differences (29.97 fps vs. 30 fps, for example) can impact your videos, it’s time to put your new knowledge to use. Try it out for yourself and see if you can recreate some of the sample videos above.

Years of history, calculations, and trial and error have gone into the standardized frame rates we use repeatedly in our work. By learning more about the reasons why these standards exist, we can better understand why our media acts and looks the way that it does. And by understanding the facts and figures behind frame rate conversions, we can easily plan for tricky post-production situations that could stall an otherwise smooth video editing workflow.

Learn more about video frame rate and its effect on timecode in Decoding Timecode Standards in Video Production.

NINJA STREAM: What Atomos’ New Device Is Really For

Presented by Richard Warburton
Shot in-house by Global Distribution

Atomos has announced the next generation of their Ninja family: the Ninja Stream and Ninja V+. We’ve been expecting the Ninja V+ for a while – it follows the natural path of product upgrades – but what about the Stream?

Decoding Timecode Standards in Video Production

We’ve all seen it. It’s in the corner of our timelines, the viewfinder of our cameras, and hundreds of other places throughout the production process. But what do you really need to know about timecode in video production?

It’s time to decode the timecode.

Timecode is absolutely critical to video production. It can save time and money in production, and it can cause a lot of problems if not used correctly. As such, it is important to understand how video timecode works, how different frame rates affect it, and how to resolve potential timecode issues that may arise during your production process.

Timecode and Frame Rates

Understanding how frame rates work is important to understanding how timecode is designed.

For example, an editor looking to place manual keyframes on a project every half-second would find it useful to know the exact number of frames each interval needs. Knowing the frame rate the project was filmed in, a well-versed editor can find this information quickly and make creative decisions based on it.

Frame rate—the number of still images per second of video—can be tricky to visualize. For cameras, NLEs, and auxiliary functions like audio sync, video runtime must be quantified in a consistent, decodable way. To solve this issue, the standardized timecode was created.

Timecode values are a way of numbering frames in video. Standardized by SMPTE (the Society of Motion Picture and Television Engineers), video timecode is usually represented as hour, minute, second, then frame number each separated by a colon (:). For example, the 6th frame at 1 hour, 23 minutes, and 45 seconds of a video would be displayed as timecode 01:23:45:06.
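For whole-number frame rates, converting a running frame count into that HH:MM:SS:FF form is simple integer arithmetic. Here is a minimal sketch (non-drop-frame only; the function name is just for illustration):

```python
def frames_to_timecode(frame_count, fps):
    """Convert a running frame count to HH:MM:SS:FF (non-drop-frame timecode)."""
    frames = frame_count % fps
    seconds = (frame_count // fps) % 60
    minutes = (frame_count // (fps * 60)) % 60
    hours = frame_count // (fps * 3600)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

# The 6th frame at 1 hour, 23 minutes, 45 seconds of a 30 fps video:
print(frames_to_timecode((1 * 3600 + 23 * 60 + 45) * 30 + 6, 30))   # 01:23:45:06
```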

Understanding Standard Timecode

When dealing with whole-number frame rates, timecode can easily be converted to actual times. For example, in a video at 30 fps, each frame is shown for 33.333(repeating) milliseconds. This means we can determine the actual time at which any given frame is displayed.

Similarly, in a video with 24 frames per second, each frame is shown for 41.666(repeating) milliseconds.

For both 24 and 30 frames per second, frame number X within any given second has the same offset as frame number X within a different second (see timecode 00:00:00:01 and 00:00:01:01 above).
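A quick sketch of that conversion, assuming frame n within a second is displayed n / fps seconds into that second:

```python
def timecode_to_ms(hours, minutes, seconds, frame, fps):
    """Actual display time of a whole-number-fps timecode value, in milliseconds."""
    return ((hours * 3600 + minutes * 60 + seconds) + frame / fps) * 1000

print(timecode_to_ms(0, 0, 0, 1, 30))   # ~33.3 ms into the video
print(timecode_to_ms(0, 0, 1, 1, 30))   # ~1033.3 ms: the same offset within its second
print(timecode_to_ms(0, 0, 0, 1, 24))   # ~41.7 ms
```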

This is not true in the case of video with a 30,000/1,001 frame rate (approximately 29.97 fps)—like NTSC television—shown below. As you can see, the actual display time in milliseconds is drifting by about one millisecond per second at this frame rate.
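The drift is easy to compute: non-drop timecode counts 30 frame numbers per second, while the frames are actually displayed at 30,000/1,001 fps. A small sketch:

```python
NOMINAL_FPS = 30                 # what the timecode assumes
ACTUAL_FPS = 30000 / 1001        # what NTSC video actually displays (~29.97 fps)

for second in range(4):
    frame_count = second * NOMINAL_FPS               # the frame at timecode 00:00:0S:00
    actual_ms = frame_count / ACTUAL_FPS * 1000      # when that frame is really shown
    print(f"timecode 00:00:{second:02d}:00 -> {actual_ms:.1f} ms")

# The display time drifts by about one millisecond per second of timecode:
# 0.0, 1001.0, 2002.0, 3003.0 ms
```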

Catching the Drift

This timecode drift for non-integer frame rates grows continuously with longer videos. Below are the timecode values for 10 minutes into the 29.97 fps video. The drift is six-tenths of a second off from the actual display time. In another 10 minutes, the difference between the timecode value and the actual display time will exceed a full second.

In this particular case, there is a modification called Drop Frame Timecode where some timecode seconds have fewer than 30 frames.

Every minute—except for the tenth minute—two frame numbers are dropped. This might be confusing, but it’s necessary to combat the drift. Drop Frame Timecode typically uses a semicolon as the separator before the frame number, instead of the colon that is used by regular timecode.

Notice that the frame counts 00 and 01 were skipped at the start of minute 01. This brings the timecode value closer to the actual display time for the frames. Each frame is displayed for about 33 milliseconds, so the drift of 0.060 seconds is close to the amount of time needed.

Why does Drop Frame Timecode work? For each 10-minute segment of 29.97 fps video, there are 17,982 frames (10 minutes * 60 seconds per minute * 29.97 frames per second). Ten minutes of video at 30 fps would be 18,000 frames exactly. The difference is 18 frames, explaining why the “drop frame” standard is to drop two frames from 9 of every 10 minutes. The scheme drops 18 timecode frames every 10 minutes, which keeps the timecode value much closer to the actual display time.
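A minimal sketch of that renumbering scheme follows; the function name is hypothetical, but the rule it implements is the one described above (skip two frame numbers at the start of every minute except each tenth minute, and use a semicolon before the frame field):

```python
def frames_to_drop_frame_timecode(frame_count, nominal_fps=30, drop=2):
    """Convert a 29.97 fps frame count to NTSC drop-frame timecode (HH:MM:SS;FF)."""
    frames_per_min = nominal_fps * 60 - drop            # 1,798 frames in a "dropped" minute
    frames_per_10min = nominal_fps * 600 - drop * 9     # 17,982 frames per 10 minutes

    ten_min_blocks, rem = divmod(frame_count, frames_per_10min)
    if rem < nominal_fps * 60:                          # first minute of the block: nothing dropped
        minutes_in_block = 0
    else:
        minutes_in_block = 1 + (rem - nominal_fps * 60) // frames_per_min
    skipped = drop * (ten_min_blocks * 9 + minutes_in_block)

    adjusted = frame_count + skipped                    # renumber as if no frame numbers were skipped
    frames = adjusted % nominal_fps
    seconds = (adjusted // nominal_fps) % 60
    minutes = (adjusted // (nominal_fps * 60)) % 60
    hours = adjusted // (nominal_fps * 3600)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d};{frames:02d}"

# After 1,800 actual frames (one nominal minute), drop-frame timecode reads 00:01:00;02
# because frame numbers ;00 and ;01 were skipped at the start of minute 01.
print(frames_to_drop_frame_timecode(1800))
```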

It’s worth noting that 10 minutes of video at 30,000/1,001 fps would contain 17,982.018 frames (10 * 60 * 30,000/1,001), so the Drop Frame Timecode does not perfectly model actual television transmission rates.

All in Good Time

Understanding the ins and outs of frame rate and timecode can help head off potential issues with your production, giving you a better chance at a smooth editing process no matter how many cameras, cuts, and complexities you encounter. And when you consider the necessary precision and specificity that go into timecode, as shown above, it can become a lot more complicated than one might expect when starting out.

It’s more than just time. It’s timecode. Decoded.

Want to nerd out some more? Check out our posts on 29.97 Things You Didn’t Know About Frame Rates and 4K Shared Storage—Considerations for Post, Color, VFX, and Archive.

CCC Live Streams Worship Services with EVO, IngeSTore

As anyone who has produced a weekly broadcast will tell you, live productions can be hard work. At Christ Community Chapel (CCC), worship leaders host an online church service for their remote congregation that at once encourages, comforts, and displays world-class production value. These online worship services include live sermons, pre-recorded music performances, multi-channel switching from 8 high-quality SDI sources, and many more complex video streaming workflow elements.

With a heavy workload like theirs, CCC’s production team was struggling to keep everything organized. In their workflow, each camera recorded to either internal memory cards or an external recording deck, while audio was recorded separately by the service’s live audio mixers. After the worship service concluded, audio and video files were exchanged between the two teams using Microsoft OneDrive cloud storage and external SSDs.

With their disconnected church tech, one team always waited on the other to complete their work before they started the next stage of production. Michael Seng, video director at CCC, put it best: “It was very convoluted.”

Christ Community Chapel provides a high-quality live broadcast of each church service for their remote congregation.

Expediting the Video Editing Process

Christ Community Chapel needed a live broadcast solution that saved their production team time and introduced a more collaborative workflow so they could cut multiple versions of their worship service faster week after week. After analyzing several workflow solutions and professional video servers to see what would meet their church live streaming and media storage needs, CCC chose the SNS EVO 16 Bay video editing server and Bluefish444 IngeSTore Server 3G.

From religious institutions with massive audiences to small churches with a devout and dedicated congregation, EVO is the high-performance shared storage solution for houses of worship and creative production studios around the world. With EVO, Seng’s video editing team collaborates on projects without the risk of overwriting project files or cluttering their file structure with dozens of duplicate versions.

CCC uses two IngeSTore Servers along with EVO for multicam acquisition and live edit. Using IngeSTore, 8 SDI feeds are captured live and encoded as either Sony XDCAM or ProRes files, depending on the requirements of each project. As they are encoding, IngeSTore transfers these files directly to EVO over a 10GbE network connection. Seng’s team then begins piecing together over 20 camera angles in a single Premiere Pro timeline, along with graphics and other media overlays on their EVO. Each camera feed is saved for post-production, so the entire process is fast, simple, and stress-free.

Minutes and Seconds Matter

CCC’s production team saves time importing media, syncing audio, and managing files with their new IngeSTore-to-EVO “edit while record” workflow. What was once a laborious multi-step process has been completely streamlined, giving Seng’s team members access to their footage while it is still recording.

“We have multiple editors pulling in footage while the live stream is going on,” said Seng. These growing files—audio and video files that are still being recorded—are quickly accessible on EVO shared storage, letting editors collaborate on productions before the online services are over. “Live-editing saves my video team a couple hours every week. That may not sound like a lot, but in our world, that can be huge.”

“We live in a world where minutes and seconds make a difference, so something that can save hours is a game-changer.”
-Michael Seng, video director at Christ Community Chapel

As if trimming hours from their post-production time and live-editing their online church services wasn’t enough, adopting EVO shared storage has even more advantages for their video production setup. But to better understand these advantages, we’ll need to better understand Seng’s team.

Value for Volunteer Video Teams

From their worship team to the church video production department, Christ Community Chapel relies on volunteers from the congregation to keep their services running smoothly. As the primary editor and leader of the live video production team, Seng had to teach volunteers to edit in Adobe Premiere Pro and use other video equipment and production technologies as needed. To help ensure their workflow is accessible to volunteer editors, Seng relies on EVO’s suite of workflow tools to simplify the post-production process.

Key among those tools is ShareBrowser, the easy-to-use media asset management software included with every EVO. With ShareBrowser, the entire team can search, preview, tag, comment, and share their entire media library. By using the integrated ShareBrowser panel for Premiere Pro, their editors don’t need to bounce back and forth between programs. The ShareBrowser panel provides direct access to their footage and organizational tools, and is contained within their Premiere Pro workspace for convenient media selection and import.

To keep things tidy, Seng created ShareBrowser bins for each team with an additional bin for general stock footage and commonly-used assets. “Wading through files wastes time,” said Seng. “ShareBrowser helps me narrow down our media library to only share the files my team needs.”

Quality Video, Quicker

For a busy church video production team like Seng’s, EVO is an important investment in their future outreach. “I just love that it saves time,” said Seng. “Projects that used to be impossible because of time constraints are no longer impossible.” In fact, Christ Community Chapel has already planned ways to expand the scope of their online worship services to new audiences and offer better remote and on-site services to their loyal congregation. It’s amazing what a better workflow can do for a growing church technology team.

To learn how EVO shared storage solutions can improve creative workflow at your house of worship, schedule an EVO demo today.

Next Generation Imaging Report

Adoption Trends
Next Generation Imaging
Updated May 2021

IABM Media Tech Trends reports annually track the adoption of specific emerging technologies within the broadcast and media sector. The purpose of these reports is to enable member companies to better understand the drivers of emerging technologies’ adoption within customer organizations. This should give member companies more tools to address the challenges lying ahead, from new product development to marketing strategy. These reports contain a discussion of the state of adoption of the emerging technology in broadcast and media, as well as an analysis of significant customer deployments.

View the interactive report below, or click here to view and download the PDF version.

In Conversation with Rexcel

We are joined by Ren Egawa, CEO and Founder of Rexcel Nippon Corporation, to discuss live streaming. In a wide-ranging discussion, Ren talks to us about democratizing video delivery, creating a performer- and audience-centric system, and enabling technology and art to work together.

Ren also talks us through a few examples of their work including recent projects with Clubhouse.

Bubble Bugle Issue 18

Well, that was an interesting year to say the least! As I said one month into Covidville, there is no rule book for this and all I know is that we need to keep communicating – and that is what we have done, whether that is in work or getting your Granny on Zoom.

Food For Thought: Digital Marketing in 2021

The era of digital marketing was born when the internet was created. Since then there have been waves of change that have washed out inflexible businesses and amplified forward-thinking ones.

Let’s take a bird’s-eye view of some of the recent, or approaching, changes:

Artificial intelligence (AI)

AI can be used to increase sophistication in many areas of a business, but when it comes to marketing in particular, AI can analyse customer behaviour, identify patterns and use data from social media platforms and blogs to help businesses understand their customers. It is commonly being used across many marketing mediums to forge better results.

5G

In a world gone mad for mobile, one of the most momentous digital marketing trends of 2020 was the emergence of 5G technology. This technology ushers in a new era of communications, and its impact will be felt across almost every sector.

Visual search

Visual search has elevated the user experience. You can now upload an image to carry out a search and get more specific results, effectively turning your phone’s camera into a search bar.

Google Lens is a visual search engine, which recognises objects and landmarks through a camera app. This also poses a new direction for visual search engine optimisation (SEO).

In addition, Pinterest has launched a beta version of its own tool, Pinterest Lens, which now recognises more than 2.5 billion objects.

Personalisation

To stand out in 2021, businesses need to personalise their marketing – including content, products, emails and much more. Consumers are becoming more and more annoyed with generic advertising and are more likely to engage with a company if it offers personalised experiences.

Firms are realising that personalised, triggered emails based on behaviour are 3x more effective than generic emails.

Privacy laws

The CCPA (California Consumer Privacy Act) and GDPR (General Data Protection Regulation) had many marketers running for the hills. In certain parts of the world, businesses are being forced to pull their socks up and take action. As technology advances, so will the legislation governing privacy.

Marketers have no choice but to play by the new rules or miss out. Marketers will have to earn the contact information of their targets rather than participate in mass marketing.

Video

Numerous businesses state that video has improved their conversion rate, confirming the importance of incorporating video into your digital marketing strategy for 2021.

It has the power to increase confidence in your business’s products or services in a short period of time. It is also by far the most popular way in which customers want to learn about new products.

There are different options for driving higher engagement with video marketing – from a simple video post, to doing a live broadcast on Facebook, Instagram or LinkedIn.

It is very easy to repurpose video content. For example, if you have a video for your YouTube channel, you could:

  • Transcribe it to create a text version
  • Publish the transcription as a blog under an embedded YouTube video for better rankings
  • Turn the transcription into a standalone blog with a short rewrite
  • Strip out the audio and use it as a podcast episode
  • Use video thumbnails in your email marketing campaigns and the word ‘video’ in subject lines to increase open rates
  • Use live videos for interviews, product demonstrations and behind-the-scenes glimpses of the brand such as ‘life in the office’ or ‘company events’
  • Create personalised video messages rather than make phone calls or send emails. With high-quality phone cameras, this has never been easier

Predictive and augmented analytics

This method utilises predictive analytics by using data, predictive modelling and machine learning to detect patterns and attempt to predict the future.

In terms of digital marketing trends, we will see a lot more of this, as it is being used in progressive lead scoring and branching logic, as well as targeting, segmentation and positioning. It can also shape individual personalisation, which helps marketers improve a customer’s life time value.

Push notifications

Almost twice as many people sign up for web push notifications as sign up for a newsletter; it is much harder to achieve a comparable sign-up rate by email.

Six hours is the average time that passes before a recipient opens a newsletter. With web push, the recipient will see the message immediately.

Position ‘zero’ in search engine result pages (SERP)

Voice search has been rising, thus rocking the search engine optimisation boat. A noteworthy example is the escalation of the featured snippet, which appears in the highly sought-after “position zero” in Google’s search engine results pages. This is yet another avenue to explore through an SEO lens.

Whilst these are just some of the latest digital marketing trends, and they only scratch the surface, with them comes an abundance of opportunities for those willing to jump onto the digital bandwagon.

If you are interested in learning more about how Bubble Agency can assist you with a stand-out digital marketing strategy to drive successful results for your business, please email: hello@bubbleagency.com