
Observations from the Field: The Future of Generative AI

We believe that in this moment of generative AI hype, nothing is more valuable than hearing directly from the entrepreneurs and product leaders building in the space. That’s why last month we hosted a panel of founders and AI practitioners to discuss the key challenges they face in building category-defining generative AI companies.
Read more

Foundation Models Are The New Public Cloud

Until recently, building an AI startup required “do-it-yourself AI”: gathering training data, labeling it, architecting complex data transformations, tuning hyperparameters, and selecting the right model. It was a herculean task, similar in complexity to the workload of the Salesforce engineer above. But in the last year or two, foundation models have emerged as a time-saving shortcut that enables entrepreneurs to do more, faster. These foundation models aren’t specific to particular AI use cases; they are largely general and have something to offer almost anyone. Entrepreneurs can now decouple the training data and model (which come pre-packaged in a foundation model) from the application layer, which we at Scale call a cognitive application.
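The decoupling described above can be sketched in a few lines: the foundation model sits behind a narrow interface, and the “cognitive application” is a thin layer of product logic on top. Everything here is hypothetical for illustration (`FoundationModel`, `StubModel`, and `SummarizerApp` are invented names, and the stub stands in for a real hosted model API).

```python
from typing import Protocol


class FoundationModel(Protocol):
    """Narrow interface the application layer depends on."""

    def complete(self, prompt: str) -> str:
        """Return a text completion for the prompt."""
        ...


class StubModel:
    """Stand-in for a pre-trained, hosted foundation model."""

    def complete(self, prompt: str) -> str:
        # A real implementation would call a model API here.
        return f"[completion for: {prompt[:30]}]"


class SummarizerApp:
    """The cognitive application: prompt construction and
    post-processing live here; the model arrives pre-trained."""

    def __init__(self, model: FoundationModel) -> None:
        self.model = model

    def summarize(self, document: str) -> str:
        prompt = f"Summarize the following text:\n\n{document}"
        return self.model.complete(prompt).strip()


app = SummarizerApp(StubModel())
print(app.summarize("Quarterly revenue grew 12% on strong cloud demand."))
```

Because the model is swappable behind the interface, the same application layer can move between foundation models without retraining anything itself.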
Read more

Natural Language Generation

Writing this content is a desk worker’s low-skill labor, demanding little expertise but lots of time. (The quality of Gmail’s Smart Compose autocomplete, introduced in 2018, illustrates just how repetitive business writing is.) That repetitiveness is the attractive target that makes Natural Language Generation (“NLG”) products so exciting: this new technology has finally grokked the patterns interwoven in our prose. NLG products are newly feasible, enabled by “transformer” language models like GPT-3 from OpenAI and Jurassic-1 from AI21 Labs.
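Just how predictable business prose is can be seen with even a toy model. The sketch below (illustrative only, using an invented four-email corpus) builds a bigram table and suggests the most frequent next word; real NLG products use transformer models rather than raw counts, but the underlying pattern they exploit is the same.

```python
from collections import Counter, defaultdict

# Tiny invented corpus of stock business-email phrases.
corpus = (
    "please find attached the report "
    "please find attached the invoice "
    "please let me know if you have any questions "
    "let me know if you have any concerns"
)

# Count which word follows each word in the corpus.
follows = defaultdict(Counter)
words = corpus.split()
for prev, nxt in zip(words, words[1:]):
    follows[prev][nxt] += 1


def suggest(word: str) -> str:
    """Return the most frequent continuation seen after `word`."""
    if word not in follows:
        return ""
    return follows[word].most_common(1)[0][0]


print(suggest("please"))    # → "find" (seen twice, vs "let" once)
print(suggest("attached"))  # → "the"
```

Transformers replace these brittle counts with learned representations that generalize across phrasings, which is what made products like GPT-3-based writing assistants feasible.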
Read more