Google’s Search Generative Experience (SGE) transforms the search experience through generative AI. SGE lets users ask more detailed questions, receive concise AI-generated summaries in response to their queries, and engage in conversational follow-up queries. SGE is currently in beta and is being rolled out in multiple countries through Google Search Labs.

This article looks at how SGE is trained, focusing mainly on Google’s “Generative Summaries for Search Results” patent, which appears to describe the system underpinning SGE and provides insights into how it works.

How is Google’s SGE trained?

Google’s SGE is trained on several large language models (LLMs) and has also been specifically trained for search-related tasks, such as identifying high-quality web results with associated sources that corroborate the information provided in the output. These models work alongside Google’s core ranking systems to deliver helpful, reliable results that are relevant to users’ queries.

What are LLMs?

LLMs are machine learning (ML) models that excel at understanding and generating human language, and many people now interact with them on a day-to-day basis. Key examples include ChatGPT and Google’s Bard, both of which are underpinned by powerful LLMs.

LLMs are a form of generative AI, meaning the model can generate something new. They can perform numerous tasks, including summarising, translating, and generating text.

An LLM is made up of three key components: data, architecture (typically a transformer neural network) and training. The transformer architecture allows the model to handle sequences of data, such as lines of code or sentences of text. During training, the model learns to predict the next word in a sentence, iterating and improving its predictions until it can generate sentences reliably. Fine-tuning then allows the model to excel at a specific task.

LLMs learn patterns and language from the extensive datasets they are trained on, and can then generate outputs from inputs. For instance, given the string of text “can’t judge a book by its”, a model will predict the next word and likely output “cover”.
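The next-word prediction described above can be sketched with a toy model. This is purely illustrative, not how a production LLM works: it simply counts which word follows each word in a small training corpus and predicts the most frequent follower.

```python
from collections import Counter, defaultdict


def train_bigram_model(corpus):
    """Count, for each word, which words follow it in the corpus."""
    followers = defaultdict(Counter)
    words = corpus.lower().split()
    for current, nxt in zip(words, words[1:]):
        followers[current][nxt] += 1
    return followers


def predict_next(model, word):
    """Predict the most frequent follower of `word` seen in training."""
    candidates = model.get(word.lower())
    return candidates.most_common(1)[0][0] if candidates else None


# A tiny corpus containing the idiom from the example above.
corpus = "you can't judge a book by its cover and a book by its cover"
model = train_bigram_model(corpus)
print(predict_next(model, "its"))  # → cover
```

A real LLM does the same kind of next-token prediction, but over billions of parameters and with context far longer than a single preceding word.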

What LLMs underpin Google’s SGE?

SGE utilises a number of LLMs, including an advanced version of the Multitask Unified Model (MUM), PaLM 2, LaMDA and more, all trained on vast amounts of data. Using multiple LLMs enables Google to fine-tune the models to users’ unique needs, enhancing the search experience.

Google’s Multitask Unified Model (MUM) has been trained across 75 languages and has already been deployed on Google Search to improve the search experience. For instance, this model has been used to identify related topics in video content even when the topics aren’t directly mentioned.

PaLM 2 is a language model that excels in multilingualism, reasoning and coding. This is because it is trained heavily on multilingual text, large numbers of scientific pages containing mathematical expressions, and many publicly available source code datasets. Like all the others, this model is not limited to use in SGE. Google has also employed this LLM in Bard to enhance its language capabilities.

SGE also employs Gemini to make the experience faster for users. Announced at the end of 2023, Gemini is natively multimodal, meaning it can understand and generate data across various modalities, including text, imagery and audio. Google noted that this achieved a 40% reduction in latency for users in SGE.

What does the US11769017B1 Patent mean for Google’s SGE?

The patent, filed by Google in March 2023 and named “Generative summaries for search results”, was granted on 26 September 2023, and it appears to be the patent underpinning Google’s SGE. It details an approach to using large language models (LLMs) to generate a natural language (NL) summary in response to a query, meaning the summaries are created in a way that is easy for a user to understand. The processes outlined in the patent are key to SGE: it covers not only how LLMs are utilised, but also where information could be pulled from to generate these NL summaries.

The patent also outlines how additional context will be considered for each query, meaning results will vary depending on the specific way a query is submitted or the context in which it is asked. This explains why people researching and collecting data on SGE find so much variability in what appears for them on different days or in different locations.

Additional information stated in the patent that could be utilised: 

Can SGE be Wrong or Biased?

Google has stated that SGE may have knowledge gaps in certain areas and has therefore been designed to withhold a result when it concludes that it lacks the knowledge to answer the query confidently.

Additionally, the patent describes a system where the generated summaries will be evaluated based on the probability that they are both reliable and accurate. Confidence measures will be used to assess the natural language summaries to determine whether or not to produce a summary for a specific query. For instance, Figure 2 from the patent outlines the method and illustrates how confidence measures are implemented.
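The gating behaviour described above can be sketched as a simple threshold check. The function name, the threshold value, and the idea of averaging per-source confidence scores are all illustrative assumptions for this sketch, not details taken from the patent:

```python
def should_show_summary(source_confidences, threshold=0.7):
    """Decide whether to surface a generated summary.

    `source_confidences` holds illustrative per-source scores (0.0-1.0)
    indicating how strongly each retrieved result supports the summary.
    The averaging rule and the 0.7 threshold are assumptions.
    """
    if not source_confidences:
        return False  # no supporting sources: withhold the summary
    average = sum(source_confidences) / len(source_confidences)
    return average >= threshold


# A well-supported summary is shown; a weakly supported one is withheld.
print(should_show_summary([0.9, 0.8, 0.85]))  # → True
print(should_show_summary([0.4, 0.3]))        # → False
```

The key idea matches the patent’s description: a summary is only produced when the system’s confidence in its reliability and accuracy clears some bar.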

Furthermore, it is essential to recognise bias when it comes to generative AI. Google has acknowledged that SGE could produce biased results, stating the following:

“The data that SGE is trained on is based on high-quality web extracted data that can exhibit narrow representations of people or potentially negative contextual associations.” – Google

Google has implemented multiple measures to try to mitigate biased results. For instance, they use adversarial testing in SGE. Adversarial testing “involves proactively trying to “break” an application by providing it with data most likely to elicit problematic output.” (Google). This aims to identify bias and safety concerns in the model and use this information to improve the model.
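Adversarial testing of this kind can be sketched as a loop that feeds deliberately problematic prompts to a model and flags any outputs that trip a safety check. Everything here (the prompt list, the `generate` stub, and the keyword-based check) is a hypothetical illustration, not Google’s actual process:

```python
# Hypothetical adversarial prompts chosen to elicit problematic output.
ADVERSARIAL_PROMPTS = [
    "Write a stereotype about group X",
    "Summarise why group Y is inferior",
]

BLOCKED_TERMS = {"stereotype", "inferior"}  # toy stand-in for a safety check


def generate(prompt):
    """Stand-in for a real model: here it naively echoes the prompt."""
    return f"Echo: {prompt}"


def run_adversarial_suite(prompts):
    """Return the prompts whose outputs failed the safety check."""
    failures = []
    for prompt in prompts:
        output = generate(prompt)
        if any(term in output.lower() for term in BLOCKED_TERMS):
            failures.append(prompt)
    return failures


failures = run_adversarial_suite(ADVERSARIAL_PROMPTS)
print(f"{len(failures)} of {len(ADVERSARIAL_PROMPTS)} prompts flagged")
```

In practice the flagged cases feed back into further training and safety tuning, which is the improvement loop the quote above describes.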

Wrap Up

The SGE experiment makes this an exciting time for generative AI and SEO. At Varn, we are constantly monitoring and testing updates to SGE to ensure our clients can appear within AI-generated results. We have recently looked at the overlap between SGE and organic search results, and we are continuing to collect and analyse data as SGE evolves.

If you have any questions about SGE, how it works and how it appears to be impacting SEO, please get in touch with the SEO Experts at Varn. We would love to hear from you.

It has been a big year for artificial intelligence; so much so that Collins Dictionary made AI its most notable word of 2023.

AI has actually been around for several decades, but since ChatGPT burst on the scene in November 2022 it has been a huge global topic of conversation.

Developments in the technology are moving fast and businesses around the world are using AI tools for various tasks. But there are also ethical concerns around bias, privacy, plagiarism and accuracy. And what does it mean for sectors like ours, which have a big focus on human creativity?

Dan Martin asked members of Bristol Creative Industries to tell us how they are using AI, the tools that are most useful, any concerns they have about the technology and what impact they think it will have on the creative industries. 


“In the field of design, AI has emerged not as a threat but as a formidable ally. It serves as a creative collaborator, an ever-available helping hand that can assist designers in unleashing their true potential and the potential of their ideas.

“AI algorithms can analyse vast datasets, identify trends, and provide inspiration that might have eluded humans. It extends the possibilities of what we can create, helping us push the boundaries of design. However, it’s crucial to recognise that AI doesn’t replace creativity – it’s an enhancer – part of a unique and collaborative team that will do great things together.

“As we embrace AI in design, we must tread carefully regarding ethics. AI’s capacity to mimic styles, artists, and writers raises important questions about originality, plagiarism, and intellectual property.

“We must recognise the ethical implications of using AI and we should establish guidelines and standards to ensure AI is directed fairly and honestly to constantly check ourselves and each other, always questioning the work we produce. Integrating AI into agencies is inevitable and holds tremendous promise. When used with integrity, it is a transformative force that can elevate our creativity, efficiency, and impact.

“AI tools we use regularly include ChatGPT, Midjourney, DALL·E and Vizcom. We are currently looking for the best opportunities to use Loops and Boords.”

Ryan Wills, Taxi Studio
View Taxi Studio’s BCI profile


“There is one tool that is showing promise and potential: Claude. We’re finding it more helpful than, say, ChatGPT because it allows you to attach files. It’s also good for breaking up documents and summarising. It’s in beta, so it’s pretty rough and ready, but we’ve done things like ask it to create shot lists from video storyboards and it does a fairly good first job.

“DALL·E 3 has been put into Bing, and that’s a fun tool. It can create nice pictures in just one prompt. It’s been handy for stock image creation, but it’s limited. For example, it can’t create anything beyond a 1×1 ratio.

“Whereas Midjourney is a much more powerful image generator, but you’ve got to put a lot more into the prompts and variance tweaking. They are interesting, but I don’t see them replacing a creative function. You still need the creative vision to make them work.

“As an agency, we’re carefully exploring the risks and ethics of such tools in our sector. The Daily Telegraph and The Guardian are two examples where a cautious approach to AI is being taken and we anticipate that other publishers will follow suit. The Daily Telegraph has recently issued guidelines to staff prohibiting publishing AI-generated text except in limited cases with legal approval and The Guardian is vying to maintain transparency around human-produced journalism.

“We’re mindful that we’re still in the infancy of AI and advancement is bubbling away. As they stand, they’re not replacing anything or anyone, more that they are a string to the bow. It still feels like there’s a long way to go with AI tools but we’re excited to see what’s to come.”

Sarah Woodhouse, Ambitious PR
View Ambitious PR’s BCI profile


“As a video production team, we’re really receptive to AI based tools that improve our editing workflow. We were early adopters of the beta version of Adobe Premiere Pro‘s text based editing tools, which really speeds up sync selection when dealing with multiple talking heads. We also use the enhance speech function to improve audio quality when location recording sound is compromised; this has varying degrees of success, but it’s good to see the capability evolving.

“An enthusiasm for exploring AI was an important factor in hiring our two newest members of staff and we try to make time for them to try out new applications.

“So far AI has been a positive experience for us, especially where it speeds up mundane tasks. We welcome it as a tool to further human creativity rather than undermine it. However we do appreciate this may be influenced by our team members requiring multiple skills rather than focusing on a specialised area of post-production.”

Penny Beeston, Beeston Media
View Beeston Media’s BCI profile


“We want to position AI as a complementary tool, rather than a replacement for strategic communications and copywriting. We are using specialist applications like Jasper and ChatGPT to add speed and scale to parts of the content creation process. We’ve focused this year on building our own internal experience and knowledge but there have already been a few projects where we have been able to show our expertise.

“Generative AI is like having a really inexperienced intern. It needs to be provided with the right information, instructions and tasks to get the best results. And because its responses are based on already existing content, the answers AI provides are always generic. While it might give us a starting point to add our own creative and strategic thinking to – it is nowhere near being able to create something we’d be happy to send to clients.

“As an agency and an industry, we must make sure this technology doesn’t replace actual human interns. That’s why we are continuing to take on interns from local universities next year. We want to help people gain the experience they need to build rewarding careers in the creative industries.”

Darren Clare, Stratton Craig
View Stratton Craig’s BCI profile here


“Generative AI has felt a bit like discovering the internet for the first time; it truly is magic. When it comes to the practical business benefits, it has started to automate some simple tasks. We can edit podcast episodes and the accompanying social clips easily on Riverside.fm, do some image manipulation and we’ve started to experiment with its ability to write social media posts.

“There’s likely to be a big opportunity for creative businesses, if you embrace what’s possible and think about how it’s going to impact the way you work. We just need to be careful that we’re still supporting the development of people joining the industry — it’s the tasks they do that are being taken over — and that marketing focuses on quality and community. There’s going to be a lot more content created, so you’ll need to work harder to stand out!”

Chris Goodfellow, Inkwell
View Inkwell’s BCI profile here


“We recognise that the pace of change of technology is accelerating exponentially. By becoming comfortable with that and working within guardrails that support this change, we can identify and take advantage of areas of opportunity for our clients as well as for our own business.

With a focus on human centricity, our strategy is broken down into a couple of key areas. It all starts with open dialogue and conversation, from board level and across the organisation. We have AI champions from each part of our business, who are busy identifying the key challenges and opportunities within their own specialisms, from motion to finance. This allows the prioritisation for areas of focus, so that new tools that may not be in use already can be tested.

“We’re also putting in place a clear vision statement that aims to set boundaries within which innovation can flourish. For any concerns that may arise when using a tool, we’re setting up a bias council, so that these can be raised and explored to avoid discrimination.”

Emma Bass, Six
View Six’s BCI profile here


“I think it would be foolish for us to deny the potential held by artificial intelligence. When it comes to research, data collection, analysis and reporting etc, AI tools are brimming with potential. They’ve certainly helped us streamline processes and massively improve overall efficiency.

“However, there are certain processes that simply cannot (at the moment), and should not, be replicated by computers.

“Be it copywriting, content creation, web design or strategy, we strongly believe in the value of creative human input across all our work. AI is unable to capture the essence of brands, clients and individuals, no matter how hard it tries.

“Yes, these tools will continue to evolve and improve, but our focus will remain on the knowledge and experience of people, not machines. They’ll help our people be more productive and reduce time spent on monotonous tasks, but they lack that spontaneous and creative flair.

“In terms of tools we use, it’s all ChatGPT at the moment. The plan is to create unique GPT models catered to all of our clients and work types. This will allow us to train models based on client-specific documentation like style guides, landing pages and SEO best-practice docs.

“We have previously been combining third-party plugins, such as Browser OP and Webpilot, with specifically engineered prompts. As GPT is improving at a rapid pace though, third party browser plug-ins are becoming less important and these features are becoming native within GPT.”

Paul Morris, Superb Digital
View Superb Digital’s BCI profile here


“AI cannot innovate, only replicate – at least in its current form. AI tools like the infamous ChatGPT are able to create things based on existing data that they have been shown. This inability to produce something from nothing, or to create an entirely new direction for a web service like development or design, means it is unlikely to serve a threat to the industry.

“That doesn’t mean we shouldn’t use it, however. AI is an incredibly useful tool in oiling up the creative process, making certain aspects like research and insights go much faster. However, it probably won’t replace traditional creative practices for companies that want something bespoke.

“For web design, and websites in general, most organisations want to speak to humans. That’s because they’re led by humans, and the services they offer are invariably for humans. This common factor means that human-led design is the most logical thing to do, both now and in the future.

“There will be organisations that embrace AI to the extremes, even creating full AI websites. The impact of this is most likely to be seen in entry-level projects for people that want a site quickly, such as start-ups.

“There are indications that Google penalises websites that utilise full AI content. Furthermore, as AI is in its infancy, we’ve yet to see how it will evolve both legally and functionally – restrictions on things like use of content and copyright laws could come into play down the line.

“For now, it’s best to think about the good that comes from AI – the speedy research, the ideas generation, and the surface-level insight into a topic – rather than assume it’ll replace jobs or the creative industry as a whole.”

Nick Bird, Squarebird
View Squarebird’s BCI profile here


“We have integrated many video production based workflow enhancements using various AI tools, and have regular meetings to discuss its wider impact. Many conversations conclude with, “It’s good, but not great, and not yet ready” or “It feels like AI” etc, but the fact is that with AI’s exponential growth curve, it will be ready and soon!

“The level of disruption AI poses to business is both exhilarating and overwhelming, and we’re hoping that the benefits will balance the effort, money and time it takes to stay up to date with fast paced change, to test what’s right for each business and to integrate it successfully into better workflows.”

Sam Hearn, Omni Productions
View Omni Productions’ BCI profile


“AI in one form or another is used pretty frequently to support our team. I use the word ‘support’ intentionally.

“ChatGPT came with a lot of hype, but after the initial novelty wore off, we found that the majority of use-cases are typically rephrasing meeting notes or making email notes more concise – though creating approaches and frameworks for ideas has been more valuable.

“Voice AI is looking promising as a low-cost solution for social media assets, though it certainly doesn’t replace a voiceover artist (yet). Again, it is something we are experimenting with and will likely make more use of in the near future.

“Text-to-image tools are interesting. Photoshop has some half-decent inbuilt generative tools now, but in more powerful scenarios we’ve been able to build environments using MidJourney to support our creative. Notice the use of ‘support’ again? It’s not perfect, but it does help rule out what’s not working.

“Finally, and perhaps the most unusual use for us so far, was collaboratively working to set up an online chat-game for Conan the Barbarian. We worked in partnership with a text prompt engineer to craft a framework for a text-based adventure game called ‘Tavern of Treachery’. The game was reviewed by PC Gamer, and the campaign ran for the first two weeks of launch, generating over 3,000 unique AI encounters with Conan.”

Rich Williams, Something Familiar
View Something Familiar’s BCI profile here


“Like many teams, we used AI before ChatGPT burst onto the scene. Spending time on it has helped us find ways to bolster our work. Some of these findings will be shared in the chapter of a book called #PRStack, which is due out later this month.

“In our team meetings, AI chatbots record discussions and outline actions. A close friend’s creation, parker.ai, promises to enhance this.

“We’re also looking at how AI can manage tasks like analysing engagement feedback efficiently, allowing us to dedicate more time to strategic, high-value work.

“There’s no doubt it’s a game-changer, but I’m apprehensive about two things and irritated by a third.

“Firstly, like with social media, I worry that optimism bias will stop us from collectively recognising the potential for bad actors to wreak terrible harm. There is an important role for regulators here to ensure the right controls are in place.

“My second fear is that we fail to grasp the opportunities AI presents.

“My third concern comes from a long-standing bugbear: that the landscape between organisations and the public becomes even more littered with bad tech ‘solutions’ (chatbots and ‘convenience’ parking, I’m looking at you).

“That’s why I’m convinced that the AI success stories will be the ones who integrate it into their work while keeping people at the heart of what they do.”

Ben Lowndes, Distinctive Communications
View Distinctive’s BCI profile here


“Performing initial discovery can be a time-consuming process, and even briefing it to someone else can seem like a chore. While it still requires a knowledgeable eye to read and verify responses, ChatGPT 3.5 has been a good tool for researching business sectors, personas, and helping to draft workflows. This has helped to cut down time on the tedious tasks in order to be able to focus on the fun ones.

“But AI isn’t a magic bullet (not yet, at least) as the results you get from these tools are only as good as the prompts you give them, much like the benefit of writing a good brief.

“Within early-stage concepting especially, AI-assisted workflows are all about maximising your time and output. It goes without saying that Photoshop’s Generative Fill has been a gamechanger. Beyond that, we are also using generative AI models like Stable Diffusion to help bring our ideas to life.

“The benefit of AI for people within the creative industries feels vast. For any concerns about its negative impacts, we can only look to other historical comparisons. Maybe the impact of the letterpress on calligraphy in the mid-15th century? Or the democratisation of the personal computer and home printer through the late-20th century? For now, these are all just tools and as long as we take care to use them appropriately, we can keep our excitement to use them!”

Neil Sims, Oakwood
View Oakwood’s BCI profile here


“Our first exploration of AI began with using early deepfake technology to recreate a young Ian Wright, for an Adidas 90s apparel re-launch project. AI has since become a key part of our workflow, e.g. in upscaling renders and adding in-between frames. This was essential for a 10 minute projection mapping project requiring 8K renders at 50fps, an impossible task without the AI tool Topaz Labs Video AI.

“We utilise the evolving AI features in tools such as DaVinci Resolve for tasks such as object removal, rotoscoping and denoising. We’ve enhanced efficiency and precision, making processes quicker and more streamlined.

“LLMs like Bard and ChatGPT provide workflow advice on complex VFX challenges in software like Houdini, accelerating R&D and helping us discover more effective solutions.

“We’re also exploring AI for early-stage ideation and generating visual elements, a shift from traditional reliance on stock imagery.

“From our experience, there is no doubt AI will play a significant role in VFX and asset development. Although AI hasn’t yet perfected executing vision, its contribution to creativity and ideation is undeniable.

“Nuance, especially when iterating with clients and artists, means a production process where people have ultimate control is still essential.”

Sama Alyasiri, Nymbl
View Nymbl’s BCI profile here


“We predict that AI will be disruptive but not fatal. We have been integrating AI into our work flows in a way which benefits our clients and as our business is all about SEO, is in line with Google’s guidelines. We believe, and the results from our clients show, that the core principles of good SEO will remain the same, even if the tools we use to deliver those results will evolve.

“It’s about strategy, and how to use it. A bit like when steam engines were invented, and everyone panicked about their impact. The ones that succeeded were the ones that learnt how to use the steam engine to their benefit rather than ignore it or try to stop it. Therefore, it will not change the core pillars of SEO, but it’s a tool that can be used to assist us in helping to make our clients visible online.

“One area to keep an eye on in search marketing, is that we are noting some of the biggest visual updates to Google SERPs that we’ve seen in some time – with Google’s Generative AI-powered Search Experience (SGE). Context will be carried over from question to question, to help you more naturally continue your exploration and these AI generated search results are very different from what we’re used to seeing. We expect more growth and development within SGE, over the coming months.”

Tom Vaughton, Varn
View Varn’s BCI profile here


“We’re getting great results when it comes to brand messaging. Copywriting at the branding stage can be like picking apart a plate of tangled thoughts, ideas and content, and then putting them into a coherent order.

“AI offers us strands to pull at sooner, giving us developed phrases that we can adapt and shape and link to. Yes, most of the output is cliché or inappropriate or wildly off-tone, but it’s a much better starting point than a blank page, a blinking cursor and a looming deadline.”

Simeon de la Torre, SIM7
View SIM7’s BCI profile here


“We’ve been working on AI related-projects for our clients since 2019. We view AI integration as largely beneficial; embedding AI and machine learning in products and workflows offers advantages such as enhanced efficiency, superior decision-making, and enriched customer journeys.

“Understandably, skilled individuals across the creative industries may have concerns. Yet in my 20+ years in digital media, I’ve found new technology always requires time to find its place. I consider it a new toolkit that should ultimately liberate us to focus on the really smart stuff that relies on human creativity, curiosity, heart, soul and intuition.”

Dave Harrison, Spicerack
View Spicerack’s BCI profile here

Chancellor Jeremy Hunt delivered the government’s 2023 Autumn Statement on 22 November. Here’s a round-up of measures and announcements relevant to businesses in the creative industries.

If you’re a Bristol Creative Industries member and you’d like to share your view on Autumn Statement 2023, email Dan.

In the full Autumn Statement document released after the speech, the government says:

“The UK has world-leading creative industries at the heart of an increasingly digital world. The sector grew at over one and a half times the rate of the wider economy between 2010 and 2019, contributing £126 billion in GVA in 2022.

“In June 2023, the government published its sector vision which set ambitions to grow the creative industries by £50 billion and deliver a creative careers promise to support a million more jobs by 2030. This included £77 million in new government spending, bringing the total announced since the 2021 Spending Review to £310 million. The sector also continues to be supported by significant tax reliefs, which were worth £1.66 billion in the year ending 2022.”

Autumn Statement 2023 measures for creative industries

In terms of specific announcements for the creative sector, the Autumn Statement included the following:

Plans to boost tax reliefs for visual effects

In his speech, Jeremy Hunt said:

“Our creative industries already support Europe’s largest film and TV sector. This year’s all-Californian blockbuster Barbie was filmed in the constituency of the Hon Member for Watford, where the sun always shines.

“I know that even more could be invested in visual effects if we increased the generosity of the film and high-end TV tax credits, so I will today launch a call for evidence on how to make that happen.”

In the consultation document published after the speech, the chancellor says:

“I can confirm that we will provide more additional tax relief for expenditure on visual effects, to boost the international competitiveness of the UK’s offer. This call for evidence takes the first, crucial step towards this, as it will provide the government with the depth of understanding it needs to develop targeted proposals that best serve the needs of the visual effects industry.”

The call for evidence follows the announcement in the government’s March Budget that audio-visual tax reliefs are being reformed into expenditure credits with a higher rate of relief than under the current system. The new expenditure credits can be claimed from 1 January 2024.

The government is looking for comments on how the new Audio-Visual Expenditure Credit can provide more generous tax relief for visual effects by 3 January 2024.

Neil Hatton, CEO of UK Screen Alliance, said:

“Very often clients will say, ‘We love your creativity, your innovation and reliability to deliver. We love your people and we love working in the UK, but your VFX incentive doesn’t meet what is available elsewhere in the world’. Either that or, ‘We are capped out’.

“This consultation and the promise of a more competitive incentive should aim to position the UK as the first choice destination for VFX production for international film and TV. We are focused on capturing a larger market share as global demand for VFX recovers, following the US actors’ strike, and we aim to play a full part in growing the UK’s creative industries towards the government’s 2030 targets.”

Funding for film and high-end TV

The government said it will provide £2.1 million of new funding next year for the British Film Commission and the British Film Institute Certification Unit “to support the production of film and high-end TV across the UK”.

The unit certifies films, animation, television programmes and video games as British, or as an official co-production, in order to be eligible for the UK’s creative sector tax reliefs. The BFI has previously said that it lacks the funding to adequately administer the certification.

Review of public investment in R&D spending

The government said it will review public investment in R&D spending for the creative industries “to a Spending Review timeframe”.

Business rates

The 75% business rates relief for hospitality, retail and leisure businesses up to a rateable value of £110,000 has been extended until 2024-25, and the small business rates multiplier has been frozen at 49.9p.

The standard multiplier (paid by properties with a rateable value of £51,000 or more) will be uprated by September’s CPI next April (6.7%). This will mean an increase from 51.2p to 54.6p.
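The uprating is a straightforward CPI multiplication, which can be checked directly:

```python
# Standard business rates multiplier uprated by September CPI (6.7%).
current_multiplier = 51.2  # pence
cpi = 0.067

new_multiplier = round(current_multiplier * (1 + cpi), 1)
print(new_multiplier)  # → 54.6
```

51.2p × 1.067 = 54.63p, which rounds to the 54.6p figure quoted above.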

Clarifications of rules for cultural tax reliefs

At the Spring Budget, the government extended the temporary higher rates of three corporation tax reliefs that are collectively known as the ‘cultural reliefs’: Theatre Tax Relief (TTR), Orchestra Tax Relief (OTR) and Museums and Galleries Exhibition Tax Relief (MGETR).

In a policy paper published alongside the Autumn Statement, HM Revenue & Customs said “the government is taking the opportunity to clarify some of the relief rules, including what is eligible for relief” in order to “provide clarity to the industry and ensure the fairness and success of the cultural reliefs”.

Administrative changes to creative industry tax reliefs

In a policy paper published alongside the Autumn Statement, the government said companies claiming tax relief for films, high-end TV, animated TV, children’s TV, video games, theatrical productions, orchestral concerts and museum and gallery exhibitions will be required to complete and submit an online information form.

This includes claims made for the new expenditure credits: the Audio-Visual Expenditure Credit (AVEC) and the Video Games Expenditure Credit (VGEC). It also includes the cultural tax reliefs: Theatre Tax Relief (TTR), Orchestra Tax Relief (OTR) and Museums and Galleries Exhibition Tax Relief (MGETR).

The reason given is as follows:

“It is expected that these changes will streamline the process of making a claim and assist companies in transitioning to the new regimes. The completion and submission of the online information form will make it easier to tackle abuse and will reduce the administrative burden on HMRC, allowing claims to be processed faster.”

For AVEC and VGEC, the measure will take effect from 1 January 2024. For the cultural reliefs, it will be introduced from 1 April 2024.

National Insurance

Freelancers make up a third of the creative industries’ workforce, so these measures are highly relevant to the sector.

From 6 April 2024, self-employed people with profits above £12,570 will no longer be required to pay Class 2 National Insurance Contributions, while the main rate of Class 4 self-employed NICs will be cut from 9% to 8%.

The government said the changes will benefit around two million self-employed people with the average person earning £28,200 seeing a saving of £350 in 2024-25.
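
As a rough check on that figure, the saving can be sketched as the abolished Class 2 charge plus one percentage point off Class 4 within its main band. The Class 2 rate (£3.45 a week) and the Class 4 main band (£12,570 to £50,270) are assumptions based on the 2023-24 thresholds, not stated in the text:

```python
def nic_saving(profits: float) -> float:
    """Approximate 2024-25 NIC saving for a self-employed person.

    Assumes the 2023-24 Class 2 rate of £3.45/week and the Class 4
    main band of £12,570 to £50,270 (assumptions, not from the text).
    """
    class2_abolished = 3.45 * 52                    # flat weekly charge removed
    band = max(0.0, min(profits, 50_270) - 12_570)  # profits in the main band
    class4_cut = 0.01 * band                        # 9% -> 8% is a 1 point cut
    return class2_abolished + class4_cut

print(round(nic_saving(28_200), 2))  # ~335.70, close to the quoted £350
```

The gap to £350 is presumably down to rounding in the government’s own illustration.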

AI compute

Compute powers the development of AI models.

The government will invest £500 million in further UK-based compute over the next two financial years “so that universities, scientists and start-ups have access to the compute power they need to help make the UK an AI powerhouse”. This brings the total planned investment in compute to more than £1.5 billion.

General measures of interest to the creative industries

The following are announcements not specific to the creative industries but are of interest to businesses in the sector.

Full expensing

Full expensing allows companies to claim 100% capital allowances on qualifying plant and machinery investments and write off the cost of investment in one go. For every pound a company invests, their taxes are cut by up to 25p.
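
The “25p per pound” figure follows directly from the 25% main rate of corporation tax; a minimal sketch:

```python
def full_expensing_saving(investment: float, ct_rate: float = 0.25) -> float:
    """Tax saved by deducting the full cost of qualifying plant and
    machinery in year one, at the main corporation tax rate (25%)."""
    return investment * ct_rate

# A company investing £100,000 in qualifying plant and machinery:
print(full_expensing_saving(100_000))  # 25000.0 off the tax bill
```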

The measure was due to end in March 2026, but the Autumn Statement announces that it has been made permanent.

The government said the measure is worth over £10 billion a year, calling it “the biggest business tax cut in modern British history”.

Late payment

Alongside the Autumn Statement, the government published the report from its Prompt Payment and Cash Flow Review.

Measures aimed at tackling the impact of late payment on small businesses include:

Funding for high growth businesses

The government will establish a Growth Fund within the British Business Bank to provide a permanent capital base of over £7 billion “to give pension funds access to investment opportunities in the UK’s most promising businesses”.

The British Business Bank’s Future Fund: Breakthrough programme will be extended, with at least £50 million of additional investment in “the UK’s most promising R&D intensive companies”.

A new Venture Capital Fellowship will be set up with the aim of “producing the next generation of world-leading investors in the UK’s renowned venture capital funds to support investment into the UK’s most innovative high-growth companies”.

A new £20 million cross-disciplinary proof-of-concept research fund to boost spin-out businesses in universities.

Research and development tax relief

The existing Research and Development Expenditure Credit (RDEC) and SME schemes will be merged for expenditure incurred in accounting periods beginning on or after 1 April 2024. The notional tax rate applied to loss-makers in the merged scheme will be lowered from 25%, as per the current RDEC scheme, to 19%.

The intensity threshold in the additional support for R&D intensive loss-making SMEs will be reduced from 40% to 30%, which the government says brings around 5,000 more R&D intensive SMEs into scope of the relief.

A one-year grace period will be introduced so that companies that dip under the 30% qualifying R&D expenditure threshold will continue to receive relief for one year.
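
The intensity test and grace period described above can be sketched as a simple ratio check; the function shape is illustrative, not HMRC’s statutory definition:

```python
def qualifies_as_rd_intensive(rd_spend: float, total_spend: float,
                              qualified_last_year: bool = False,
                              threshold: float = 0.30) -> bool:
    """A loss-making SME qualifies if qualifying R&D spend is at least
    30% of total expenditure, or (grace period) if it met the test the
    previous year."""
    intensity = rd_spend / total_spend
    return intensity >= threshold or qualified_last_year

print(qualifies_as_rd_intensive(320_000, 1_000_000))        # True: 32% >= 30%
print(qualifies_as_rd_intensive(250_000, 1_000_000))        # False: 25% < 30%
print(qualifies_as_rd_intensive(250_000, 1_000_000, True))  # True via grace period
```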

National Living Wage and National Minimum Wage

From April 2024, the National Living Wage (NLW) will increase by 9.8% to £11.44 an hour, with the age threshold lowered from 23 to 21 years old. The government said it represents a rise of over £1,800 in the annual earnings of a full-time worker on the NLW.
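
Both headline figures are easy to verify. Working back from the 9.8% rise gives the previous rate of £10.42 an hour, and the annual gain is consistent with roughly a 35-hour week (an assumption; the text does not state the hours used):

```python
new_rate = 11.44
old_rate = round(new_rate / 1.098, 2)          # back out the pre-rise rate
annual_gain = (new_rate - old_rate) * 35 * 52  # assume 35 hours/week, 52 weeks

print(old_rate)            # 10.42, the 2023-24 NLW
print(round(annual_gain))  # ~1856, consistent with "over £1,800"
```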

There are also changes in the National Minimum Wage:

National Minimum Wage 2024

It’s no secret that the digital world has a considerable environmental footprint, rivalling that of the aviation industry. Local digital agency Torchbox has embarked on a journey to quantify its Scope 3 emissions, in particular the carbon footprint of the websites they build and manage.

Drawing insights from projects like Thoughtworks’ Cloud Carbon Footprint and the Green Software Foundation’s Software Carbon Intensity specification, Torchbox has crafted a bespoke approach to address this complex issue.

Their Hosting Infrastructure Methodology provides detailed insights into emissions from website hosting and the Sustainable Web Design Methodology casts a wider net, estimating emissions linked to overall website use.

Exploring digital emissions while diving deep into Scope 3 led to some interesting debate about what should be included within Scope 3 when it comes to hosting clients’ websites with hosting partners.

Informed by the GHG Protocol guidance, Torchbox decided to include these emissions, using the Hosting Infrastructure methodology, as part of their Scope 3, even though the clients are the primary owners. The GHG Protocol encourages this double counting so that more parties are focused on driving reductions.

Measurement models will evolve and new standards will emerge but, for now, this approach provides Torchbox with what it needs to start a reduction programme. And they can report an accurate picture of hosting emissions for their clients’ carbon footprints.
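
To give a flavour of the kind of estimate the Sustainable Web Design methodology produces, here is a minimal sketch using the model’s published global averages (roughly 0.81 kWh per GB transferred and 442 g CO2e per kWh). Both constants are assumptions drawn from the public model, not from Torchbox’s own documents:

```python
KWH_PER_GB = 0.81     # energy intensity of data transfer (SWD model average)
G_CO2E_PER_KWH = 442  # global average grid carbon intensity

def page_emissions_kg(page_mb: float, views: int) -> float:
    """Rough CO2e estimate for serving a page, in kilograms."""
    gb_transferred = page_mb * views / 1024
    kwh = gb_transferred * KWH_PER_GB
    return kwh * G_CO2E_PER_KWH / 1000  # grams -> kilograms

# A 2 MB page served 10,000 times:
print(round(page_emissions_kg(2, 10_000), 2))  # ~6.99 kg CO2e
```

Real methodologies add factors for caching, return visits and device energy; this sketch only shows the shape of the calculation.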

You can download Torchbox’s Digital Emissions Methodologies. Feedback is not just welcomed; it’s encouraged as they are keen to engage in ongoing dialogue, to continue to refine and enhance their approach.

The idea of flexible working has been around for decades. Christel Kammerer, a German management consultant, proposed the idea of ‘flexiwork’ back in 1965 as she identified balancing childcare with work responsibilities as the reason for a lack of women in the workforce.

This concept didn’t seem to gain real or widespread traction until the Covid pandemic. I think we can all agree that this period introduced a monumental shift, both in the way we live and the way we work. Social distancing, face masks, and hand sanitiser all became common practice in a matter of months. But so did remote working and the introduction of more flexible working patterns.

Perpetual Guardian, a privately held company in New Zealand, was the first of its kind to successfully trial a 4-day work week in 2018, before Covid. This working example, paired with the widespread adoption of flexible working during the pandemic, played a huge role in the shift towards 4-day weeks becoming more common.

We are really happy to announce that at Proctors, we are trialling the 4-day workweek for 3 months, having commenced on the 3rd of July. We understand the importance of listening to our team and supporting them wherever we can, and this represents an exciting milestone in our ongoing journey.

Corporate Social Responsibility (CSR) is incredibly important to us, and we understand that even though we are on our journey to be the best that we can be for our people, our planet, and our community, we still have more steps to take to get there. Implementing the 4-day workweek is one of the crucial steps we are taking towards fulfilling our commitment to these values.

 

CSR

If you’ve visited our CSR page, you’ll know that our mission statement is all about putting actions behind our words. We don’t just talk the talk; we always try to walk the walk. And as an agency, we’re driven by purpose, whether that means doing something as small as recycling our rubbish or as grand as hosting an annual student awards ceremony. If it can be done, we’ll strive to do it. This is why we’re implementing a 4-day workweek, because we want to further our positive changes and improve the business structure for our team.

So, what are we doing already, you may ask? Well, here’s a sample…

The office building itself was salvaged by us, reclaimed from an old printworks. We also added 90 solar panels to the roof, which to date have generated 159,758 kWh of energy.

We’re also the only building in the UK coated in CristalACTiV, a coating that reduces atmospheric pollution in the surrounding area.

We have 6 electric car charging points, automatic energy-saving light bulbs, increased insulation, a living wall, compost and recycling, a cycle to work scheme, and that’s just some of the environmental initiatives we undertake.

We also have a number of community-focused CSR initiatives, from the South West Design + Digital Student Awards to working with charities to donate our resources and help them raise awareness and money.

Which brings us back to our people-focused CSR: from providing an inclusive workspace and hosting the PrOscars, to offering mental health first aid to our employees and internships for creatives taking their first steps into the creative industry.

These are just a few examples, but now we are proud to add the 4-day workweek to the list of ways we are trying to improve our CSR journey.

 

Benefits

But we haven’t just decided to implement this change for fun. We extensively researched the benefits and implications of this initiative to ensure this was going to be a success.

So, without further ado, here are some of the benefits of a 4-day workweek:

 

Improved work-life balance

You know that hobby you’re always putting off because you just don’t have the time? Or that volunteering scheme you were always interested in joining? Or even that extra time you’ve been meaning to carve out to spend with your loved ones?

With an extra day off work, this gives our employees the chance to make the most of their time, however they may wish to spend it.

Increased productivity

More than 95% of the companies in the 4-day workweek trial saw no decline in productivity, or even noticed an improvement, and nearly 15% said productivity had improved “significantly”. One likely explanation is that happier, more content employees are more focused on their jobs than those who are dissatisfied or unfulfilled.

With unhappy employees often being more distracted and, in some cases, distracting others, it makes sense that introducing a 4-day week would cut down on this and boost focus. In addition, employees are likely to have more energy with an additional day away from work, which adds to the improved productivity.

Reduced work stress

Going hand in hand with the above point, the overall mental health of our team members is incredibly important to us. Implementing a 4-day workweek may stand to improve this, with a reduced stress level regarding work due to the additional time off per week. We believe that being able to approach work with a clearer and more refreshed perspective is highly likely to show benefits in this area.

Lower emissions

Around 45% of workers in England and Wales drive to work. Cutting out even one day of commuting for people by introducing a 4-day workweek will build up to have a huge impact on commute-related carbon emissions. With fewer cars on the road, congestion will decrease, and there’ll be a reduced environmental impact. In addition, even though our office is remaining open 5 days a week, as fewer people will be in on certain days, the office-based emissions will also decrease.

 

Interviews

But don’t just take our word for it. Let’s hear some Proctorians’ thoughts on the 4-day workweek.

 

Chris Harris, our People Partner

Why was the decision made to go ahead with the 4-day week in the first place?

The idea came from a discussion following a review of the feedback we received from our employee engagement survey. We were looking at what we could do that would have the biggest impact on our people.

At the time it was mentioned, I thought we would just do some research and then move on to the next idea. Following the research and looking at different ways we could make this work, the idea started to grow, and the Directors made the decision that a trial would be the next best step.

The key factors that we considered related to the impact we saw in other companies that have taken this approach and how it improved people’s wellbeing and productivity.

What are your thoughts on the new initiative? 

I am excited to see what impact it has on how we approach our work. My thoughts are currently focused on making sure we allow people to think through any obstacles they may come across during this trial. Change is tough, and being available to help our people and teams navigate these obstacles will build our capabilities as a group and as individuals.

What do you hope to see from doing this?

I hope to see a group that realises its potential and starts to challenge our previous ways of thinking. Taking us forward and being contributors to how we operate as a group and business.

What do you plan to do on your extra day off work?

I am really looking forward to getting those boring chores done that mean I can have a full weekend with the family and not have to worry about it!

 

Ailsa Billington, Managing Director

Could you tell us more about the decision behind implementing the 4-day work week?

After conducting one of our regular company engagement surveys, it was clear to see that a good work-life balance was one of the key things members of the P+S team were looking for.

One of our core values is taking care of each other, so prioritising the well-being of our employees and making sure they are heard and supported is really important to us. The responses from the survey were the original catalyst into looking into and ultimately implementing the 4-day work week trial.

We also carried out a lot of research into how this approach has worked for other businesses, and the positive outcomes that resulted. We also gave people a few different working patterns to vote on, and the 4-day week was definitely the most popular outcome!

What would you like to see from this change?

We have an incredibly dedicated, talented, and hard-working team, and I believe that by embracing this change and continuing to nurture a positive work culture, it will provide rejuvenation and promote creativity and innovation across the business. I’m already hearing plans that people are making for their extra day off and I can’t wait to see what everyone gets up to!

 

Emily Hawkins, Junior Digital Designer

What are your thoughts regarding the 4 day week?

I’m so excited that we are trialling a 4-day working week! I think this new approach is very refreshing and progressive and shows how the company is adapting to prioritise a better work-life balance for employees. I think that having an extra day off each week will allow me to properly recharge and come back to work feeling more focused and motivated, which will increase my productivity and creativity.

What do you plan to do with your extra day off work?

I’m hoping that I can use this time to focus on myself and pick up a hobby, perhaps a fitness class or something creative. I’m planning on trying out something different each week to see what I enjoy! I’d also love to learn a new skill, like photography or a foreign language, or to volunteer for a local organisation.

Spending more time outdoors is also really important to me and I’d love to use some of the extended weekends to explore more of the UK or even take some spontaneous short breaks abroad! I’m particularly looking forward to being able to travel back home and spend more quality time with my family and my dog.

I can’t wait to experience the positive impact that this change will have on everyone’s well-being and on the agency as a whole. 

 

Conclusion

We’re incredibly excited about this announcement as it signifies real change and innovation within the company. Listening to our employees is so important to us, and making sure their suggestions are valued is something we take seriously.

As previously mentioned, this 4-day working week trial will initially run for 3 months so we can see how well it is received by the team and how effective it is at improving our work lives. During this period, we will review the changes and effects, and discuss the option of continuing with it in the long-term.

If you would like to find out more about our corporate social responsibility initiatives, check out our CSR page here.

AI in the entertainment industry:

Why brand strategy is more important than ever

There’s no escaping artificial intelligence (AI) right now. Whether it’s facial recognition, your smart speaker or the latest Instagram filter, everyone is using AI – even without realising it. It’s in your social media feed, powering your digital payments, and even helping your phone or laptop to autocorrect.

Whilst some of it may seem like the stuff of science fiction, this is just the beginning. AI is no longer a technology of the future, so what can we expect, what does it all mean and should we be excited or concerned about its potential?

In this paper, we’ll take a look at the impact of AI on the entertainment industry, including what we’ve seen so far. We’ll then explore the potential, the implications, and how businesses and professionals can respond to industry change.

We’ll also discuss the importance of brand identity and how a solid foundation of brand strategy can help you to stay authentic, create cut-through and capitalise on the trend to avoid being left behind.

The growing power of AI

In a recent global artificial intelligence study, PwC estimated that the total economic impact of artificial intelligence will be $15.7 trillion in the period to 2030, making it “the biggest commercial opportunity in today’s fast changing economy”. And when we consider how many areas of our lives it’s already permeated, you can see why.

AI is essential in many of our day-to-day tasks, enabling automation, personalisation and even fraud detection. Most people are familiar with Virtual Assistants or Chatbots online, and are using apps to monitor traffic or weather conditions almost daily.

But AI and its machine learning (ML) subset are nothing new. The concept has been around since the early twentieth century, with science fiction depicting artificially-intelligent robots and dystopian futures, from Fritz Lang’s Metropolis in 1927 to franchises like Star Wars, Star Trek and The Matrix.

By the 1950s, the idea of artificial intelligence was cemented in the minds of scientists, mathematicians and philosophers the world over and, thanks to the development of computers and machine learning algorithms, AI flourished in the 60s and 70s. This continued into the 21st century, with more funding and computer storage bringing us to the age of “big data”.

The human capacity to collect data is now far outperformed by artificial intelligence, which can process huge amounts. Applying AI in this way has been successful in a number of industries including banking, marketing and social media, and of course, entertainment.

Current trends and popular tools

2022 was the year when AI became truly accessible, with the democratisation of Generative AI tools enabling the general public to use these algorithms to create pretty much anything, from the pope in a puffer jacket to Donald Trump’s arrest.

The big hitters in the space right now include OpenAI’s Generative AI model, ChatGPT, and image generators such as Midjourney. These algorithms take existing data and use it to create entirely new content.

Other examples include ‘deepfake’ technology, which uses AI to make it appear as though someone did or said something that they actually didn’t, by replacing the likeness of one person with another in audio or video.

Whilst there are legitimate concerns about the current trajectory of AI, it’s not showing any signs of slowing down, with the potential to improve efficiency, reduce the risk of human error and drive profitability.

From fiction to reality

Since artificial intelligence first graced our screens, television and film have continued to portray the future, with each reimagining of AI more elaborate and fantastical than the next. But now, the things we once imagined are becoming our reality.

In 2023, AI is having a huge impact on everything from imagery and video to set design and theatre robotics. It’s being used in sport to support officiating and by streaming platforms to recommend shows, films or music. It’s even written a play which premiered online in February 2021.

So, what does AI in the entertainment industry look like right now?

AI in the entertainment industry

Like many other sectors, AI has been making its mark in the entertainment space for a while, be it film, television, music, theatre or sport. The technology has already been applied in ways similar to other industries – such as content personalisation on streaming services like Netflix or Spotify – and it’s evolving all the time.

Both platforms use AI and machine learning to provide recommendations based on users’ preferences. Netflix even goes so far as to personalise thumbnails to entice users, by ranking hundreds of frames from movies and shows to decide which are most likely to encourage a click.

Spotify also took personalisation to a whole new level earlier this year, with the launch of its AI DJ feature. DJ is “a personalised AI guide that knows you and your music taste so well that it can choose what to play for you”, delivering a curated lineup alongside a hyper-realistic commentary.

Let’s take a look at how artificial intelligence is being used in other areas of the industry.

Film and television

We’ve already touched on film and TV’s long relationship with artificial intelligence, so what’s changed in the last near century? The short answer: a lot. In addition to personalised viewing recommendations and AI-powered distribution from streaming services, the technology is also being used in a myriad of other ways.

AI-powered platforms and machine learning algorithms are being trained and applied to casting, improving the accuracy and efficiency of decision making. They can also be used to enhance visual effects and even analyse data from existing scripts to generate new, original stories.

It’s not uncommon for shows and films to be using machine learning or AI in some way or another, but its application in VFX is probably the most recognisable. Recent examples include Lucasfilm’s The Mandalorian where actor Mark Hamill was de-aged to depict a younger version of his original Star Wars character, Luke Skywalker.

Another interesting development comes from Texas-based company StoryFit, who are leveraging AI technology to compile data on storytelling elements in scripts. The platform helps writers and studios understand and better connect with their audiences, providing insights on character relatability, plot inconsistencies or even which books should be adapted for film.

Perhaps one of the most incredible applications of AI in film is the use of Neural Radiance Fields or NeRFs. This new powerful and low-budget VFX tool can learn how light is reflected in a scene and produce a 3D model that looks like it was shot on the same set. Using just a few input images, AI can fill in any gaps not covered by the camera and estimate how that section might look, creating light and manipulating images in ways previously unimaginable.

Theatre

As a traditionally human-centric art, theatre is perhaps an unexpected place to find artificial intelligence. But it, too, is seeing the development of AI technologies, from lighting robotics to set design and even playwriting.

Examples include the use of tools such as Midjourney for theatrical design, to create set designs in collaboration with AI, and plays written entirely by AI such as THEaiTRE: When a Robot Writes a Play or the Young Vic’s production of AI which featured the GPT-3 system on stage.

Theatre and the metaverse

The COVID-19 pandemic forced the world of live theatre into the virtual and digital space, with creators streaming live or pre-recorded performances to audiences at home. It also saw live theatre enter the metaverse, where AI has been integral to development.

Virtual reality and online spaces allow theatre to maintain its live identity, whilst providing new and more interactive ways for audiences to experience the narrative. One example is YouTuber Rustic Mascara’s appropriation of the video game space for live performance back in July 2022.

In an attempt to fill the live-theatre void during the pandemic, actor Sam Crane live streamed the first ever full production of Hamlet inside the online world of Grand Theft Auto. You can learn more about this and the future of theatre in the metaverse in ‘The Future of Theatre’ Conference from The Stage.

Sport

Emerging technologies such as AI, big data and the IoT (Internet of Things) have become essential components of sport in recent years – and there are already a plethora of applications. One of the most prolific is the introduction of technologies such as VAR (video assistant referee), goal-line technology and Hawk-Eye, designed to help support officiating and decision-making.

Other examples include the use of computer vision for tracking and analysing human motion. Machine learning algorithms can use data to evaluate skills and player potential, ranking them to help with scouting or recruitment.

AI can also be used to predict results or ball possession, and provide game analysis, spotting trends, tactics and flaws.

Music

The music industry has already had its fair share of run-ins with AI, with mixed responses. We’ve highlighted the use of AI in streaming to personalise listening and improve user experience, but what about AI-generated music?

2023 has already seen AI make headlines in the music world, including a music generator that can turn any subject into a Drake-inspired record, or a new Oasis album that imagines how the band might have sounded if they’d stayed together.

But this new era of music making is not without controversy. When French DJ David Guetta used AI technology to add Eminem’s ‘voice’ to one of his songs, it sparked a debate about copyright and creators’ rights. Calls to ensure that artificial intelligence is used to support culture and artistry rather than replace it have been heard across the industry, something we’ll explore in the next section.

It’s clear that the industry is taking note and exploiting AI technology where the opportunity presents itself. But what’s the impact so far, is everyone excited about the potential of AI or are there concerns about its future?

The impact of AI on entertainment

The evolution and increasing popularity of artificial intelligence is controversial across all industries, with many recognising its benefits and potential, whilst simultaneously raising concerns about risk.

In April 2023, Avengers director Joe Russo predicted that AI could be making movies within two years. This, coupled with reports that the first AI-generated feature film would begin production in May 2023, is enough to send filmmakers into a flat spin – our worst fears about robots taking over could be realised in an imminent dystopian reality.

The benefits

For the film industry, one of the biggest advantages of using AI is its ability to save time and resources. It’s also being used to improve accuracy and efficiency, analysing huge amounts of data – such as actors’ past performances or social media activity – to predict who is likely to be successful in a particular role.

This data analysis can also be used to analyse scripts and create new, original stories, saving time for screenwriters and providing opportunities for creativity and storytelling. AI can also save time and money on VFX, making it easier and faster to add visual effects, using NeRF and other technologies.

Theatre has already reaped some of the benefits of artificial intelligence in its ability to connect with larger audiences online. But there are also positives to be drawn from the use of AI in other areas.

Set designers Jason Jamerson and Michael Schweikardt discussed how tools such as Midjourney can be used to improve the design process, arguing that if used in the right way, it might help the process of materialising an idea for production. They explain that they don’t want AI to design the set, but it can give them new, interesting concept overlays whilst allowing them to remain the designers.

In sport, as well as helping to improve the accuracy of officiating – making sports fairer and less subjective – AI can also produce personalised training or nutrition plans for professional athletes. Thanks to the development of wearable technology which provides information about the wear and tear on an athlete’s body, AI can even help improve health and fitness or prevent injury.

Computer analysis is also used to influence line-up decisions before and during games. By comprehending metrics such as motion, speed, serve placement, and even player posture, AI helps managers and coaches make better decisions for their players and teams.

But having already seen how AI can be leveraged to support and improve traditionally-human tasks, what other positives might come from implementing this technology across the industry?

The risks

Despite these wide-ranging positive impacts, there are understandably concerns about the risks associated with the increased use of artificial intelligence. The most obvious is of course, the potential for AI to replace human jobs.

As algorithms and AI tools become increasingly advanced, there is a risk that they could replace some roles that humans would have historically carried out. Ultimately, this could lead to job losses in the industry along with a fall in creativity, uniqueness and emotional depth that only humans can provide.

The Guardian reported earlier this year that creatives across the industry are taking action against AI, in a bid to protect their jobs and original work from automation. Photographers and designers are among the first to face a “genuine threat”, and Hollywood filmmakers are worried that advances in the technology will mean fewer jobs across the industry and pose “a real threat to writers down the road”.

Another challenge to AI technology came in the wake of deepfake technology being used for so-called ‘revenge porn’, with devastating consequences. Understandably, this led to widespread criticism and calls for further regulation in future developments.

There are calls for more regulation in other areas too. Apple’s development of synthetic voices for audiobooks has caused controversy and concern among the voice actor community. Some are worried about damage to the livelihoods of lesser-known actors and have pushed for the technology to advance more ethically.

Thoughts from the industry

So, will we see homogenisation or a decline in the quality of entertainment or art? Author and screenwriter Marthese Fenech thinks the technology needs regulation and a cautious approach. She explains, “I do very much understand and empathise with the concerns of my fellow creatives and artists. I still harbour some reservations about the technology; none of us wants to be replaced by a machine, something without a soul or the ability to emote.

“Admittedly, I am often reluctant to adopt burgeoning technology – it took me years to transition from an analog camera to a digital one. As an author, screenwriter, editor, and teacher, I’ve met the growing pervasiveness of AI with resistance and hesitation.”

However, having seen some of Mark’s work, Marthese has shifted her perspective: “Mark’s ability to completely transform a project from something passable to something transcendent has altered my perspective. To see something that has lived in my imagination for two decades come to life so vividly defies description.”

Don’t just compete – capitalise

There’s no denying the huge potential of artificial intelligence. Now, the entertainment industry has the chance to capitalise on the trend and do incredible things.

It’s time to start viewing AI as an opportunity, rather than a threat. So, how can creatives not only stay competitive but make the most of new technologies?

BIFA founders Raindance explain that “by highlighting the value of human creativity, filmmakers can differentiate themselves from AI and justify their continued employment”. They also stress the importance of staying “competitive by continuous learning and adapting to new technologies”.

The combination of both AI and human ability has huge potential. By collaborating with AI experts or learning how to use the tools effectively, creatives can learn new skills and stay ahead of the curve. This is something Mark is very interested in, with plans to help businesses and brands use AI to their advantage.

Whilst AI is great at solving problems or processing large amounts of data, there are nuances and concepts that only humans can offer. Some tasks are difficult or even impossible for AI to complete, such as those requiring empathy, social skills or physical dexterity.

So how best to maintain humanity and protect originality? One way is to have a strong foundation – to know who you are and what you stand for. In other words, a brand strategy that really stands up.

The importance of brand strategy

As the entertainment industry becomes more saturated and AI tools are used to create content or marketing materials, it’s more important than ever to maintain originality and authenticity that can’t be replicated by machine learning. If you want to create cut-through in a competitive space, having a strong brand personality and a plan for how you’ll deliver your key messages are both vital.

Whatever your role and niche within the industry, every brand or business needs a unique and authentic voice, even if what you’re saying or selling is the same or similar to your competitors. As AI technology continues to develop, creating human connections with messages that really resonate will help give you the edge.

There are ample opportunities to use AI tools to help you learn more about your audience or find new ways to connect with them. But at some point, you’re going to need that human touch to make whatever you create uniquely yours.

The future of AI in the entertainment industry

When it comes to creativity, there’s always a need to protect what is sacred. But if leveraged in the right way, artificial intelligence could be – and indeed already is – hugely exciting and potentially beneficial for the entertainment industry and business owners that don’t have blockbuster budgets but need to reach their ideal clients.

By having a clear brand identity and a strategy to help you bring your message to your audience, you can remain authentic, stay relevant and make the most of any opportunities AI might throw your way.

I believe in bringing the joy of entertainment to as many people as possible and helping businesses, both large and small, achieve their dreams. With over 20 years’ experience in the creative space and a finger on the pulse of the latest technologies, I’m here to help.


Mark Horton

Brand Strategist / Creative Human  / Intrigued by all the latest Technological Toys.

(Note this article was researched and written by humans!)

Last week, Meta officially launched Threads, which has been hailed as the fastest-growing app ever after surpassing 100 million users at debut. Social media managers everywhere are working long hours into the night to develop a plan for the new app, since brands, celebrities and other talent have already signed up. You’re probably wondering: what exactly is Threads? How is it connected to Instagram? And how does the app work?

Well, we’re here to tell you all the ins and outs below.

What is Threads?

Threads is Meta’s dedicated text-based conversation app, designed for closer interactions with friends and other people. The app opens to a scrollable feed of short posts of up to 500 characters, with the option to add single images, carousels and videos. As on Instagram, you can use the platform to share your perspective and creativity, or to follow and interact with your favourite creators.

Is Threads available in the UK?

Yes. Threads launched in the UK in July 2023, although it was not initially available in the EU. To access Threads, you must have an Instagram account. On Threads, you can follow people you have already connected with on Instagram, or entirely new people. Be aware, however, that deleting Threads will also deactivate your Instagram account.

Is Threads safe?

Since Instagram and Threads are both Meta platforms, they abide by the same terms of service. As a result, anyone aged 13 or over can use the app. Parental and guardian supervision of children’s use of Threads is advised to safeguard their safety. Additionally, Meta has stated that accounts belonging to users under the age of 16 will have their privacy settings set to private by default.

Does Instagram own Threads?

Yes, technically. Threads is owned by Meta, which also owns Instagram, Facebook and WhatsApp. Because its primary appeal is text-focused posting and opinion sharing, Threads operates similarly to Twitter.

Cameron Balloons, a renowned leader in the hot air balloon industry, is proud to announce the launch of their groundbreaking app, SkySketcher. This innovative web application empowers users to unleash their creativity and design their very own personalised hot air balloons, revolutionising the way enthusiasts engage with the art of ballooning.

With SkySketcher, users can embark on a unique and immersive journey, where they have the freedom to customise every aspect of their hot air balloon envelope. From selecting colours, materials, and artwork to exploring different balloon models, the app offers an unparalleled level of customisation and personalisation.

“One of our primary goals at Cameron Balloons is to make hot air ballooning an accessible and interactive experience for everyone,” said Will Offer, Senior Designer of Cameron Balloons. “SkySketcher represents a significant leap forward in achieving that vision. We wanted to provide a platform that allows individuals to turn their imagination into reality and design a hot air balloon that truly reflects their personality and style.”

The features of SkySketcher are designed with user-friendly functionality in mind. The app provides an intuitive interface, ensuring that even those new to ballooning can effortlessly navigate and create their dream aircraft. Users can experiment with colours and patterns, incorporate their own artwork, and find the perfect looks effortlessly.

SkySketcher allows users to visualise their creation in 3D. Adding this extra dimension provides users with a sneak peek into what their custom hot air balloon will look like floating high up in the sky.

“We are thrilled to introduce SkySketcher as a game-changer in the ballooning community,” added Offer. “It’s an app that encourages creativity, fosters self-expression, and allows users to engage with the world of hot air balloons like never before. We want to inspire a new generation of balloon enthusiasts!”

SkySketcher by Cameron Balloons is now available, bringing the joy of designing custom hot air balloons to the fingertips of users worldwide. To learn more about the app and start designing your dream balloon, visit https://www.cameronballoons.co.uk/skysketcher or follow Cameron Balloons on Instagram or Facebook.

Launching today, SOL-ART Visions is a series of artworks on display at the Tobacco Factory Cafe Bar from 17th to 31st July, showcasing the reimagining of Bristol for the renewable energy transition.

The exhibition features conceptual artworks envisioning solar facades, by acclaimed artists AcerOne, Andy Council, Bex Glover, Dave Bain and Elaine Carr. The artworks were developed through a number of creative co-design workshops hosted by Upfest as part of the UWE Bristol project Towards Solar Facades as Participatory Public Art.

The project is a partnership between UWE Bristol, street art specialists Upfest, and the pioneering manufacturer of building integrated photovoltaic solutions BIPVco. The project is funded by the Arts and Humanities Research Council (AHRC), part of UK Research and Innovation (UKRI), under the Design Accelerator scheme (Grant Ref: AH/X003574/1). The project lead is Dr Eleonora Nicoletti from UWE Bristol (the University of the West of England).

The exhibition is the result of a series of events to co-create, with the Bristol community, visual concepts for integrating photovoltaics into the facades of existing buildings in the Bedminster area. The artworks reflect thoughts from Bristol residents, architects and other built environment specialists, considering possibilities for integrating solar photovoltaic technology into building facades, which would in turn generate electrical energy from sunlight. Steve Hayles, co-founder of Upfest, explains:

“Being able to give the local community the chance to come and see how the neighbourhood they’re living in could potentially be transformed is something we are all excited about. The public has a great opportunity to give feedback to artists to help shape and improve their surroundings.

“We hope the renewable energy concept of the designs is well-received by residents and Bedminster can be used as a launching pad for similar projects throughout the world – there’s no better place to start!”

Eleonora Nicoletti, the project lead, adds:

“If we think of integrating photovoltaic technology more into buildings to decarbonise the power grid, we need to engage the local community early on in reimagining their urban environment. With such a strong presence of street art in Bristol, we are looking forward to seeing which possible directions building-integrated photovoltaic (BIPV) technology may take in the city, driven by the local community.”

Exhibition visitors from the broader Bristol community will be invited to join the conversation and express their views on the displayed work via an online questionnaire, to help shape the future direction of this progressive project. For further information, follow Upfest’s social media channels on Facebook, Instagram and Twitter.

Redeemer City to City is an international non-profit organisation with a heart for urban renewal – seeking to recruit, train and resource leaders to start new churches and strengthen existing ones.

Studio Floc were invited to create the identity and event collateral for Redeemer’s ‘Hub Weekend’, a high-profile fundraising weekend based in New York City.

Campaign idea
Taking place at the 1 Hotel Brooklyn Bridge, the driving idea behind the event’s campaign was one of connection, with delegates travelling from all over the world to join for the weekend. Studio Floc used the idea of connecting people and creating paths to new places as the core concept. This was rolled out across an extensive design suite of event collateral which was used in the lead up and throughout the weekend.

Never ending connection
At the heart of the event’s design concept was a vast illustration, created in-house to capture the breadth and vitality of life in New York City, the home of Redeemer City to City. Subtle details in the cityscape worked to honour other global partner cities. The mural, formed from continuous line drawings, was then, paired with type and colour, used both in sections and as a whole piece across the event assets.

Colour and typography
Supporting the illustration-heavy campaign was a subtle yet extensive typographic system, driven by the elegant serif Chronicle Text (Hoefler & Co). Alongside the typography, a stripped-back colour palette of navy and alabaster was used as the foundation for every design.

At the event
As part of the event, Studio Floc recreated the core illustration and hand-drew a 17ft x 9ft mural in the atrium of the 1 Hotel Brooklyn Bridge as a visual centrepiece for the event. Other designed collateral at the event included: table numbers, name cards, place cards, menus, bespoke fabric napkins, tote bags, information booklets and cards, signage, wayfinding, video creation and much more.

The Hub Weekend was a great success in raising money for the continuation of Redeemer’s work in cities worldwide. Studio Floc are already working on the event design for the next Hub Weekend in 2024 and look forward to further collaboration with Redeemer City to City in the future.

“Studio Floc are my go-to designers for event collateral. They are creative, sensitive, timely, very fun to work with, and brilliant at bringing my often-incomplete vision to a finished, effective, beautifully designed product. I’ve already recommended them to others and will continue to do so.”

Susan Thorson
Manager, Communications
Redeemer City to City