Here is a complete step-by-step guide to Spark AR filter creation, along with its features, sample templates, usage, and examples:
With the advent of AI and automation, application development has become easier and more accessible to everyone. Thanks to low-code/no-code and visual programming platforms, people without technical or coding knowledge can build applications.
All you need is the logic in mind; just by dragging and dropping widgets, the intended solution arrives in no time. Spark AR is one such low-code/no-code visual programming toolkit, built especially for Augmented Reality technology.
As the name suggests, this article provides a comprehensive, step-by-step guide to Spark AR filter creation, so by following it, you can create your own Instagram filter.
What is Spark AR
Spark AR is an augmented reality toolkit introduced by Meta (Facebook) to create AR filters. The mission of this toolkit is to democratize Augmented Reality so that non-technical people, especially non-coders and artists, can enter the realm of AR. Have the logic in mind, then drag and drop the graphical nodes to arrive at the overall solution.
Spark AR Studio, the Instagram AR filter creation software, is available for download along with the Spark AR Player app, which is used to test the created AR filter during development.
In this section, we will do a walkthrough of the Spark AR editor UI.
New Project Creation Interface
Upon opening the downloaded Spark AR Studio tool, the user is presented with a new project creation interface to create a blank project of either the sharing experience type or the video calling experience type.
Spark AR Sample Template
Spark AR Studio offers users multiple predefined templates to choose from, so they can pick one that suits their idea or thought process. Some of the available predefined templates are makeup, object reveal, hair color, eye color, etc.
Basic Building Blocks of Spark AR
The basic building blocks available in Spark AR Studio to create AR filters are Scene, Layers, Assets, Viewport, Manipulators, Mode, Filter, View, and Emulator.
#1) Scene
To create an AR filter, all the assets are placed on a scene, so a basic understanding of the scene is necessary.
Scene Understanding provides support for the following:
- Face Tracker: The Face Tracker feature tracks facial features and their interaction with the AR filter.
- Plane Tracker: The Plane Tracker is essentially a world-tracking feature: flat surfaces in the real world are tracked, and graphical elements are placed on them.
- Target Tracker: The Target Tracker handles image recognition; when the target image is recognized, the AR filter action is triggered.
- Hand Tracker: The Hand Tracker identifies and keeps track of hand movements and gestures and triggers AR filter actions based on them (a script-equivalent sketch of these trackers follows this list).
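For creators who later move beyond drag-and-drop, the same trackers are also exposed to JavaScript. The snippet below is only a minimal sketch, assuming the relevant tracking capabilities are enabled in the project; it simply watches how many faces and hands are currently tracked:

```javascript
// Minimal sketch: watch the live tracker signals in the Spark AR Studio console.
// Assumes the Face Tracking and Hand Tracking capabilities are enabled.
const FaceTracking = require('FaceTracking');
const HandTracking = require('HandTracking');
const Diagnostics = require('Diagnostics');

Diagnostics.watch('Faces tracked', FaceTracking.count);
Diagnostics.watch('Hands tracked', HandTracking.count);
```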
#2) Layers
The Layers feature helps in rendering assets like 2D or 3D objects over the camera feed in a specific order. Objects closer to the camera are rendered in layer 0, ahead of the other objects that are farther away from the camera.
#3) Assets
Assets play an important role in AR filter creation. Through this asset panel, all 2D and 3D assets, materials, textures, scripts, and common components like blocks are added as part of the AR filter.
#4) Viewport
The viewport is the central point of Spark AR Studio. This is where one can design, edit, and view the AR filter being worked on. The default components available in the viewport are directly linked to the Scene panel's camera and focal distance features.
#5) Manipulators
Manipulator features carry out edit operations on 2D/3D assets.
- Position: Selecting a 2D/3D object displays the axes icon, which facilitates moving the selected asset object.
- Scale: This feature makes the selected 2D/3D asset objects bigger or smaller.
- Rotate: This feature helps in rotating the 2D/3D asset objects along the x, y, and z axes, respectively.
- Local/Global: With this feature, the parent-child relationship between the asset objects is handled. With Local feature selection, the child object takes up the location coordinates of the parent. On Global feature selection, the child object takes up the world X, Y, and Z coordinate positions.
- Pivot/Centre: Manipulating the asset object either at its center or at a pivoted point is handled by this feature (see the script sketch after this list).
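The same position, scale, and rotation edits can also be made from a script. This is only a sketch under assumptions: the object name plane0 is a hypothetical example, and it relies on the asynchronous Scene API available in recent Spark AR Studio versions.

```javascript
// Sketch: scripting counterparts of the Position, Scale, and Rotate manipulators.
// 'plane0' is an assumed example object name.
const Scene = require('Scene');

(async function () {
  const plane = await Scene.root.findFirst('plane0');

  plane.transform.x = 0.1;                 // Position: move along the x axis
  plane.transform.scaleX = 2;              // Scale: make the object twice as wide
  plane.transform.rotationZ = Math.PI / 4; // Rotate: 45 degrees around the z axis
})();
```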
#6) Mode
The Mode feature provides support to edit 2D objects with X and Y coordinates and 3D objects with X, Y, and Z coordinates in detail.
#7) Filter
This feature provides the option to add or remove some components necessary to create an AR filter.
#8) View
The View feature is only available in 3D object mode and provides diverse options for viewing 3D objects via the cameras assembled in the viewport. Orthographic projection, perspective projection, bird's-eye view, left, right, front, back, top, and bottom are the options available under the View feature.
#9) Emulator
An important feature of Spark AR Studio is the emulator, which helps creators visualize how the AR filter will look on a mobile device before it gets deployed to their Instagram or Facebook account. It provides options to simulate events like orbital movements, touch gestures, and rotation.
The emulator offers facilities to simulate the AR filter on various iPhone and Android model phones.
How to Use Spark AR to Create an AR Filter
Coming up with an AR filter is a creative process that starts with Ideation, Concept, Draft, Design, Asset preparation, Development, Testing, and Publishing. Here in this section, we will focus on the development of AR filters.
The example here is an advertising campaign to celebrate the 75th Indian Independence Day. The concept of this AR filter is to show "the Indian in you". So, the idea is to split the face and display the Indian flag over the face of the AR filter user.
Step #1: Create New Project
Open Spark AR Studio; there will be an option to create a new project, either from a blank new template or from one of the available templates.
Step #2: Add a Face Tracker
The face tracker feature is the base feature of this AR filter, so add one face tracker, which tracks the gestures of the face. Add a face tracker from the menu by clicking Add -> Scene understanding -> Face Tracker or by hitting the + button in the scene tab, selecting Face Tracker, and pressing Insert.
Step #3: Add Face Mesh
Face mesh assets respond to various facial expressions, like opening the mouth, eyebrow gestures, and so on. The face mesh is a child of the Face Tracker component. Add a face mesh by right-clicking on the Face Tracker -> Add Object -> Face Mesh. In this step, add three face meshes representing the right side, the left side, and the base face image.
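Face mesh deformation happens automatically, but the same facial-expression data is available as live signals if you prefer scripting. A minimal sketch, assuming the Face Tracking capability is enabled:

```javascript
// Sketch: read a facial-expression signal from the first tracked face.
const FaceTracking = require('FaceTracking');
const Diagnostics = require('Diagnostics');

const face = FaceTracking.face(0);
Diagnostics.watch('Mouth openness', face.mouth.openness); // live value that grows as the mouth opens
```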
Step #4: Add Null Object
The Null Object component is like a container that groups assets under one umbrella, so that any transformational update, like a resize, position, or rotation change, made on it is reflected on all the related objects grouped under it. Add a Null Object by right-clicking on the Face Tracker -> Add Object -> Null Object, and then click Insert.
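The same grouping behavior can be seen from a script: one transform change on the null object moves everything under it. A small sketch, where nullObject0 is an assumed object name:

```javascript
// Sketch: transforming the container (null object) affects all grouped children.
const Scene = require('Scene');

(async function () {
  const group = await Scene.root.findFirst('nullObject0'); // assumed object name

  group.transform.rotationZ = Math.PI / 12; // tilt the whole group (all children) by 15 degrees
})();
```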
Step #5: Group Face Meshes Under the Null Object
Drag all the face meshes under the null object and place the null object under the Face Tracker object.
Step #6: Give Name to Face Mesh and Apply Material
Provide a unique name for each face mesh object added to the face tracker. Create material for each face mesh by selecting materials under the properties tab of the selected object.
In the illustration below, the face meshes are given the names Img_back, Img_right, and Img_left. Each face mesh has its own material, named imgback_mat, imgright_mat, and imgleft_mat, respectively.
Step #7: Material Shader Type and Advance Render Options
For the created material, apply the shader type and set the texture. Let the shader type be flat. Material shader type is available on the right-side Inspector panel.
For this split-face AR filter, the material's advanced render options need to be customized by deselecting the Use Depth Test and Write to Depth features available under the Advanced Render Options tab.
Step #8: Importing Image File
Import the image files from the local disk by clicking Add Asset -> Import -> From Computer. This split-face AR filter concept requires a couple of images: one for the base face image and the others for the left and right sides of the face.
Import all the photos from your local computer disk, or drag and drop the images from the disk directly into the Assets panel on the left-hand side of Spark AR Studio.
Step #9: Face Tracker Texture Extraction
Enable texture extraction for the Face Tracker component so that the extracted camera texture becomes the texture for the left and right-side face meshes in this AR filter. Select the Face Tracker object and, on the right-side Inspector panel under the Texture Extraction section, click + to create the texture.
Step #10: Apply Texture to the Material
The split-face AR filter needs three face meshes. One of them, Img_back, is configured with the imgback_mat material and the Indian flag image as its texture. The face meshes representing the right and left sides of the face, Img_right and Img_left, take imgright_mat and imgleft_mat as their materials, respectively.
Since this is a split-face AR filter, configure the texture of the left and right-side materials as the faceTracker0 Texture on the Inspector panel. The original image is set as the texture under the Alpha tab, and the Invert option is checked to show only half of the face.
Step #11: Patch Editor and Logic Flow
In the concept of this split-face AR filter, the user taps on the mobile screen, the face splits into left and right halves, and the base face is shown with the Indian flag. The entire logic can be achieved with the Patch Editor. Click the View option on the top menu bar and select Show Patch Editor to open it.
Patch editor is a visual editor that supports visual programming with nodes and graphs.
Screen Tap, Switch, Pulse, Delay, Loop Animation, Transition, Multiply, and 3D Position nodes are used for this split-face AR filter. To achieve the split-face logic, select the Position attribute of the Img_right and Img_left face meshes, then connect it to the Transition node in the visual editor.
A Multiply node with its X value set to -1 is connected to the Img_left face mesh so that the left side of the face moves toward the left.
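For creators who prefer scripting over patches, the tap-to-split logic above can be approximated in JavaScript. This is a sketch under assumptions: the object names Img_left and Img_right come from Step #6, the 0.05 offset and 500 ms duration are placeholder values, and the Touch Gestures capability must be enabled for the project.

```javascript
// Sketch: script-based equivalent of the Screen Tap -> Transition -> Multiply patch graph.
const Scene = require('Scene');
const Animation = require('Animation');
const TouchGestures = require('TouchGestures');

(async function () {
  const [left, right] = await Promise.all([
    Scene.root.findFirst('Img_left'),
    Scene.root.findFirst('Img_right'),
  ]);

  // A time driver plays the transition once per tap; the sampler maps its
  // 0..1 progress to a horizontal offset (placeholder value of 0.05).
  const driver = Animation.timeDriver({ durationMilliseconds: 500, loopCount: 1 });
  const offset = Animation.animate(driver, Animation.samplers.linear(0, 0.05));

  // The right half slides right; the left half gets the same offset multiplied
  // by -1 so it slides left (the Multiply node in the patch graph).
  right.transform.x = offset;
  left.transform.x = offset.mul(-1);

  // Screen Tap equivalent: restart the transition whenever the user taps.
  TouchGestures.onTap().subscribe(() => {
    driver.reset();
    driver.start();
  });
})();
```

Either approach works; the patch nodes map roughly one-to-one onto these calls (Screen Tap to onTap, Transition to the animated sampler, Multiply to mul).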
Step #12: Test on a Device via Spark AR Player or Preview It on a Facebook/Instagram Account
Created AR filters can be tested in two ways: one is by installing the Spark AR Player app on a mobile device, and the other is by directly posting the AR filter to a Facebook or Instagram account under the Preview option. Both options are available in Spark AR Studio.
How to Publish AR Filters in Meta and Instagram
It is time to publish or upload the developed and tested AR filter to an Instagram account, a Facebook account, or both. Spark AR Studio supports and eases submitting it for approval. Click the Publish button icon to initiate the AR filter submission process.
Before publishing the AR filter, conditions like the prescribed file size limit and platform-specific operating capabilities are to be satisfied.
The allowed file size limit is 4 MB for publishing on your Instagram account, whereas for Facebook, one has the liberty to go up to around 10 MB. Select View File Size under the Platform Requirements feature of the publishing window.
Manage Capabilities deals with the option to select the features that will be part of the filter and that will work on Instagram or Facebook accounts.
Once the file size requirements and platform capabilities are met, Spark AR Studio displays the steps to publish the created AR filter for approval.
One can submit the AR filter as a new effect or as an update to an existing effect, along with a demo video showing the steps to use the AR filter. The studio also offers the option to export or save the created AR filter in the .arproj file format on a local drive.
Spark AR Hub and its Features
Spark AR hub is basically a dashboard that displays the created AR filters for the registered account, analytics insights, a gallery of AR effects, creators of other AR effects, learning tutorials, and effects that are specifically used for advertisements.
The Effects page showcases the AR filters created by the creator, along with their visibility status, review status, and ways to share each AR filter on an Instagram or Facebook account.
The Insights page provides a detailed analytics view of the created AR filters and audience-related details such as gender, the number of times the AR filter was opened, captures, shares, and impressions.
How Brands Can Use Spark AR
Social media marketing has been a buzzword in recent years. For most brands trying to increase their sales and target the mass market, the obvious way is to choose social platforms like Instagram, Facebook, WhatsApp, and Twitter.
AR filters created with Spark AR Studio are hosted on Instagram and Facebook as Instagram reels or Facebook ads. If the AR filter is compelling and has that wow factor, then it is more likely to be shared among social media users.
Social media platforms have major advantages in terms of marketing, with features like followers and influencers. They, in turn, would promote the brand product. So creating engaging and interactive AR filters followed by influencer promotion would increase brand awareness.
Having a mesmerizing and shareable AR filter by influencers and followers would fuel the growth of the brand product by word of mouth.
Spark AR Hub provides analytics insights on the number of opens, captures, uses in Instagram reels, and region-specific audience details. With this information, brands can come up with compelling AR filters to extend their reach.
Why Create AR Filters for Instagram Stories
Instagram stories are the perfect way to introduce the user to an AR filter and influence them to try it. The AR filter effect appears as an icon in the icon tray of the Instagram camera effect. When selecting the icon, AR filter magic happens.
Influencers can try the AR filter and post it as a story on their Instagram account to encourage followers to try this AR filter.
Examples of Spark AR Filters
Hand Gesture AR Filter
This AR filter was developed to raise awareness of COVID-19 vaccination and celebrate Diwali. Upon recognizing the hand gesture, the AR filter will come into effect.
Randomizer AR Filter
This AR filter was designed and developed to educate children about good touch and bad touch by conducting a quiz with images that would generate randomly above their heads.
Frequently Asked Questions
Q #1) Does Spark AR cost money?
Answer: Spark AR Studio is free to download and use for creating AR filters. If you have artistic skill and a little technical knowledge of Photoshop or 3D software, you can create and publish AR filters.
Q #2) What programming language does Spark AR use?
Answer: Spark AR Studio was created to cater to the needs of non-technical people and artists who want to contribute creatively to AR technology. It supports both visual programming (the Patch Editor) and traditional scripting in JavaScript.
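As a tiny illustration (a sketch only, not a full filter), a JavaScript file added from the Assets panel can log to the Spark AR Studio console and coexist with patch-based logic:

```javascript
// Minimal Spark AR script: print a message to the Spark AR Studio console.
const Diagnostics = require('Diagnostics');

Diagnostics.log('Hello from a Spark AR script');
```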
Q #3) Can you use Spark AR without Facebook?
Answer: No. To create AR content, one should use Spark AR Studio, which requires Facebook authentication, and to host the created AR content, one should have a Facebook or Instagram account. Without Facebook, it is not possible.
Q #4) Is Spark AR only for iOS?
Answer: Spark AR content works on both Android and iOS platforms. If the Facebook or Instagram app is installed on either mobile platform, the created AR filter can be viewed.
Q #5) What is an Instagram AR filter?
Answer: An AR filter is an augmented reality effect created using Spark AR Studio and hosted on the social media platforms Instagram and Facebook, where it is shared through stories, reels, and ads.
Conclusion
Spark AR and its studio are excellent tools for creating Augmented Reality filters, especially for non-technical people.
Techies who crave bringing technology and creativity together, as well as artists trying to bridge creativity with technology but lacking tech knowledge, can very well use this platform to highlight their talent.