Featuring the Revolutionary ‘Zoom Out’ Function, Overhauled Aesthetic System, and High Variation Mode for Unparalleled Artistic Expression
Midjourney update wows AI artists with camera-like feature
Midjourney’s fresh-out-of-the-oven AI model, version 5.2, now boasts a snazzy “zoom out” feature. It’s like holding a camera and pulling back on the lens, all while crafting a larger scene around your original picture. A nod to OpenAI’s DALL-E 2 and its outpainting antics, with a twist: it only works on images baked inside Midjourney’s own oven. Custom image extension? Not on the menu.
Midjourney’s preferred playground is its Discord server, where users can try this zoom-out magic, punching up images with “Zoom” buttons and the “Make Square” option. It’s all about experimenting with scale, whether you want to go for 1.5x, 2x, or somewhere in between.
Midjourney’s improvements are being hailed as “stunning” and “mind-blowing”, which is high praise even in the constantly amazed world of AI. That said, not everyone’s throwing confetti at the AI image synthesis party. There’s a cloud over how these systems are trained, essentially snatching millions of images from the web without a by-your-leave. Midjourney’s been mum on exactly what’s in their data soup. Adobe’s making a run for the ethical line with Firefly, but active artist consent is still on the skinny side.
Toyota Research Institute unveils generative AI-powered vehicle design tool
Toyota Research Institute (TRI) has whipped up a new tool that’s jazzing up the car design process. Powered by “generative AI”, it helps designers sketch out cars in response to simple words like “sleek” or “SUV-like”.
Ordinarily, designers would have to juggle between pretty drawings and hard engineering facts, like how aerodynamic a car is, or how tall it can be. This often ends in a tug-of-war between design and engineering, making the process longer than a cross-country road trip. But with this new AI tool, it’s a little like having a nifty assistant that keeps both in check, making sure a beautiful car doesn’t overlook the nitty-gritty of engineering.
What’s more, it seems to be quite the crowd-pleaser. Designers are digging the chance to play around with ideas, knowing that the tool has got their back on the technical side of things. It’s still in the research phase, but there’s talk of getting it integrated into Toyota’s design process, meaning we might see the fruits of this AI labour on the roads sooner than you think.
In the long run, they’re hoping that this tech can kick open the doors for designers everywhere, letting them get crazy creative while also speeding up the design development process. As our pal at Toyota, Avinash Balachandran, puts it, this tool’s all about “amplifying people”. A new age of car design might just be around the corner, folks.
Social scientists look to AI models to study human behavior
Social scientists are playing with the idea of using AI chatbots like ChatGPT to dig into the human psyche. The basic idea? If chatbots can spin out everything from cover letters to code, perhaps they can help us understand human behavior a bit better, without the usual time, cost, and potential awkwardness of human experiments.
The argument? Since these AI systems are fed on a banquet of diverse human text data, they may be able to offer a wider range of human perspectives than traditional methods. And while chatbots might not be people, they might be able to help generate hypotheses that could be tested on real humans.
In an intriguing twist, one study found that the AI’s responses to moral scenarios had a whopping 95% correlation with human ones, making us wonder whether we need humans for these kinds of evaluations at all.
However, critics argue that these AI systems are just parroting back what they’ve been trained on, rather than demonstrating genuine understanding. The worry is that these models might create echo chambers, where fringe and minority opinions get left out. There’s also concern about how accurately the AI represents the patterns of relationships among ideas, attitudes, and contexts in human populations.
Prophecy’s generative AI assistant ushers in a new era of data pipeline automation
Prophecy, a data engineering startup from sunny California, is looking to kick data pipeline creation up a notch, turning it from a boring chore into a walk in the park. Its latest creation, the data copilot, is an AI assistant that not only speaks human but also whips up data pipelines using natural language prompts.
The upside of this AI butler is that it can free up data engineers to tackle other pressing tasks, rather than having them chained to their desks writing complex SQL code. It’s a move up from Prophecy’s earlier offerings that let users drag and drop their way around a visual canvas to create data pipelines.
Prophecy’s CEO, Raj Bains, spills the beans, explaining that they build a comprehensive knowledge graph containing all sorts of data. It’s like a treasure map of company data, encompassing everything from technical and business metadata to the nitty-gritty details of executed queries and code. This data feast is then fed to a highly sophisticated language model that translates a user’s everyday language into a data pipeline. So, you talk, it listens and boom – you’ve got a pipeline.
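As a rough sketch of the flow Bains describes – and nothing more than that – here’s what “metadata in, pipeline out” could look like in Python. The metadata shapes, function names, and `complete` callable are all illustrative assumptions, not Prophecy’s actual code or schema; `complete` just stands in for whatever LLM completion API sits underneath.

```python
def build_context(knowledge_graph: dict) -> str:
    """Flatten technical/business metadata into prompt context for the model."""
    lines = []
    for table, meta in knowledge_graph.items():
        lines.append(
            f"table {table}: columns={meta['columns']}, "
            f"description={meta['description']}"
        )
    return "\n".join(lines)

def generate_pipeline(request: str, knowledge_graph: dict, complete) -> str:
    """Turn a plain-language request into pipeline code via an LLM call."""
    prompt = (
        "You are a data-pipeline assistant. Using this metadata:\n"
        + build_context(knowledge_graph)
        + f"\n\nWrite SQL for: {request}"
    )
    return complete(prompt)
```

The point of grounding the prompt in the knowledge graph is that the model generates SQL against tables and columns that actually exist, rather than hallucinating schema.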
And Prophecy is not stopping at just the data copilot. They’re also offering a platform that enables companies to build AI solutions, like chatbots, on top of their private, enterprise data. Think of it as putting a chatbot on steroids, with the ability to delve into internal messaging systems, documents, support tickets and more to provide relevant answers to user queries.
AI video creation app Captions bags $25M from top VCs
Captions, the video-creating brainchild of Gaurav Misra, just nabbed a cool $25 million from some big-time investors. We’re talking about folks like Kleiner Perkins, Sequoia Capital, Andreessen Horowitz, and SV Angel.
Once a simple app for chatty videos, Captions has beefed up into a full-blown creative suite. The tool’s got a lot of bells and whistles: AI that translates your words into different languages while keeping your voice, human-like AI voiceovers, AI that whips up snappy short clips from longer ones, and even AI that composes music for your vids.
The big news here is that Captions is now running on OpenAI’s GPT-4 and a neat eye contact feature made in partnership with Nvidia. The AI gives the illusion that a video host is looking directly into the camera, even when they’re reading from a script. Seems simple, but apparently, it’s been a big hit since it was launched.
Investors are praising Captions for being the spearhead in AI-powered video content creation. Misra’s got a knack for understanding creators, and his success with Captions is a testament to his abilities. With all the hype and money flowing in, it looks like Captions is paving the way in the brave new world of AI video production. Stay tuned to see what else they whip up next.
Talent Select AI automatically screens job candidates
Talent Select AI, a company based out of Milwaukee, wants to shake things up a bit. They’ve been working on a tool that uses natural language processing, a fancy term for how a computer understands human speech. It listens in during your job interview and, like a detective with a magnifying glass, looks at your choice of words to figure out if you’re a fit for the job.
Talent Select AI plans to offer this as a user-friendly software on their website. According to their CTO, Will Rose, they only look at the words and ignore the way you say them or your appearance, keeping things fair across different cultures.
Initial results show a 50% drop in hiring time, a hefty 80% increase in hiring from underrepresented groups, and 98% of users feeling more confident in their selection. The science behind all this, psychometrics, has had its ups and downs since its birth at the University of Cambridge back in 1887. Critics question its reliability and ethical implications, especially when it comes to making serious decisions, like hiring. Even the creators of the Myers-Briggs test agree it shouldn’t be used for hiring decisions.
Chip Startup Rides AI Frenzy to Become Top-Gaining Topix Stock
Socionext Inc., a Japanese chip designer, has seen its shares skyrocket more than fivefold since its public listing in October, riding high on the wave of anticipated silicon demand driven by AI. The stock is up over 245% this year alone, lifting the company to a market value of around $6.7 billion and earning it the title of top performer on the Topix index.
Birthed in 2015 from a merger between the system-on-chip (SoC) divisions of Fujitsu Semiconductor Ltd. and Panasonic Holdings Corp., Socionext creates tailor-made modules for an array of sectors including consumer, automotive, and industrial fields. Unlike big guns like Nvidia and Advanced Micro Devices Inc., Socionext only develops and produces chips when a customer places an order.
OpenAI PPUs: How OpenAI’s unique equity compensation works
OpenAI, a tech giant in the AI industry, keeps its pay structure close to its chest. But a deep dive reveals they use an intriguing mix of base salary and ‘Profit Participation Units’ (PPUs).
Originally, OpenAI was a nonprofit. In 2019, they switched to a ‘capped profit’ model to fund their lofty goal of developing Artificial General Intelligence (AGI). Basically, their investors and employees get a capped return if they hit their targets. Anything over that goes back to the nonprofit side of OpenAI.
How does pay work at OpenAI? Well, you have your regular paycheck, and then there’s the PPUs. It’s like equity but based on company profits, and if the company doesn’t make a profit, your PPUs might as well be Monopoly money. But if the company does make a profit, you get a slice of the pie based on your PPUs. Also, the tax man only takes his cut when you sell or receive profit, not when you get the units.
OpenAI’s PPUs are almost like the Profit Interest Units (PIUs) other startups give out. The company hands out PPUs when they make an offer, and these vest over four years. The kicker is, you don’t know how many PPUs you’re getting or how many exist in total. So you don’t know if you’re getting a sliver or a slice of the profit. Also, you can’t sell your PPUs in the first two years and there’s a growth cap at 10x the original value. Along with these PPUs, OpenAI offers benefits like medical, dental, mental health benefits, 401(k) with 4% matching, unlimited time off and more.
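To make those mechanics concrete, here’s a toy Python sketch of how a four-year linear vest and a 10x growth cap could interact. The function names, grant size, and dollar values are illustrative assumptions for arithmetic’s sake – not OpenAI’s actual terms.

```python
def vested_units(total_units: int, months_employed: int,
                 vesting_months: int = 48) -> float:
    """Linear vesting over four years (48 months)."""
    return total_units * min(months_employed, vesting_months) / vesting_months

def payout_per_unit(original_value: float, current_value: float,
                    cap_multiple: float = 10.0) -> float:
    """Per-unit payout, capped at cap_multiple times the original value."""
    return min(current_value, original_value * cap_multiple)

# Hypothetical example: 1,000 units granted at $10 each; two years in,
# a unit is nominally worth $150, but the cap limits it to $100.
units = vested_units(1_000, months_employed=24)   # 500 units vested
value = payout_per_unit(10.0, 150.0)              # capped at $100/unit
print(units * value)                              # 50000.0
```

The cap is the detail that distinguishes this from ordinary equity: once a unit has grown 10x, further company upside flows past the holder (in OpenAI’s case, back to the nonprofit).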
Meet The AI Protest Group Campaigning Against Human Extinction
Joep Meindertsma, a 31-year-old Dutch database company owner, believes AI could lead to societal collapse and perhaps even human extinction. His distress has led him to form “Pause AI”, a protest group advocating for a pause in AI development.
Meindertsma’s concerns aren’t entirely unheard of. AI bigwigs, such as Geoffrey Hinton, have warned about AI risks, while a rising number of people reportedly fear an AI-induced apocalypse.
Meanwhile, Meindertsma and his followers have staged small protests in several cities worldwide. He’s also managed to chat with officials in the Dutch Parliament and the European Commission. Their demands? An international halt on AI development until we learn how to construct it safely.