The program has been tested by artists including Sofia Crespo, Scott Eaton, Alexander Reben, and Refik Anadol.
By Vittoria Benzine
Meta AI, the artificial-intelligence research division of Facebook’s parent company, published a report this week about an exploratory research project called Make-A-Scene, one “that demonstrates AI’s potential for empowering anyone to bring their imagination to life,” according to a July 14 blog post.
Make-A-Scene aims to go a step further than the typical text-to-image generator by letting users draw a freeform digital sketch of a scene for the network to base its final image on.
“Our model generates an image given a text input and an optional scene layout,” the report reads. “As demonstrated in our experiments, by conditioning over the scene layout, our method provides a new form of implicit controllability, improves structural consistency and quality, and adheres to human preference.”
Images generated by Refik Anadol in collaboration with Meta’s Make-A-Scene program. Photo: courtesy of Meta.
Generative art is a longstanding, critical form, though the medium has recently enjoyed a heyday in the mainstream. Many might remember that a work created using AI by the French art collective Obvious fetched $432,500 at Christie’s in 2018. And new media pioneers like Herbert Franke and Refik Anadol have been employing the medium to create works of conceptual depth and poignancy.
“To realize AI’s potential to push creative expression forward, people should be able to shape and control the content a system generates,” Meta AI’s blog post reads. “It should be intuitive and easy to use so people can leverage whatever modes of expression work best for them.”
“Imagine creating beautiful impressionist paintings in compositions you envision without ever picking up a paintbrush,” the post continued. “Or instantly generating imaginative storybook illustrations to accompany the words.”
To fine-tune their results, the team at Meta asked human evaluators to assess the images created by the AI program. “Each was shown two images generated by Make-A-Scene: one generated from only a text prompt, and one from both a sketch and a text prompt,” their post says.
Adding the sketch resulted in an image that was better aligned with the text description 66.3 percent of the time. Meta notes that Make-A-Scene can also “generate its own scene layout with text-only prompts, if that’s what the creator chooses.”
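The evaluation protocol described above reduces to a simple tally: for each pairwise comparison, record whether the evaluator preferred the sketch-conditioned image, then average. The sketch below is illustrative only (not Meta’s actual evaluation code), using hypothetical judgment data sized to reproduce the reported 66.3 percent figure.

```python
# Illustrative sketch of aggregating pairwise human judgments.
# Each judgment is True when the evaluator preferred the image generated
# from sketch + text over the one generated from text alone.

def preference_rate(judgments):
    """Return the fraction of comparisons won by the sketch + text image."""
    if not judgments:
        raise ValueError("no judgments to aggregate")
    return sum(judgments) / len(judgments)

# Hypothetical data: 1,000 evaluators, 663 of whom preferred the
# sketch-conditioned image -- matching the rate Meta reports.
judgments = [True] * 663 + [False] * 337
print(f"{preference_rate(judgments):.1%}")  # 66.3%
```

A win rate meaningfully above 50 percent is the signal here: at exactly 50 percent, the sketch input would be adding nothing over the text prompt alone.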
Images generated by Sofia Crespo in collaboration with Meta’s Make-A-Scene program. Photo: courtesy of Meta.
The software hasn’t gone live to the general public. So far, it has only been rolled out to a few employees and select AI artists, like Sofia Crespo, Scott Eaton, Alexander Reben, and, naturally, Refik Anadol. “I was prompting ideas, mixing and matching different worlds,” Anadol remarked of his experience using the program. “You are literally dipping the brush in the mind of a machine and painting with machine consciousness.”
At the same time, Andy Boyatzis, a program manager at Meta, “used Make-A-Scene to generate art with his young children of ages two and four. They used playful drawings to bring their ideas and imagination to life.”
Since the report was released, Meta has increased the potential resolution of Make-A-Scene’s output four-fold, to 2048 x 2048 pixels. They’ve also promised to provide open access to demos moving forward. For now, though, they advise keeping an eye out for more details during their talk at the European Conference on Computer Vision (ECCV) in Tel Aviv this October.

©2022 Artnet Worldwide Corporation. All Rights Reserved.