The difference with ChatGPT is that it doesn’t so much threaten the university experience as strike directly at the heart of the purpose of a university education – its ability to ‘teach you how to think’. There have been shadows of this in the past: the hostility towards Wikipedia (Coomer 2013), the emergence of essay mills, and even simple, now commonplace tools such as spell checkers and calculators. I remember vividly a very angry professor in the early 2000s telling me that reading lists with hyperlinks would turn students into baby birds, mouths wide open, expecting to be spoon-fed. We have largely moved through those advances in technology and realised their benefits, but this one, I would argue, is different. Not because it has no benefits, but because the sheer volume and scale of what is coming will be meaningfully different, and will ultimately challenge the foundation upon which we measure that ability to think – university assessment.


Learn more / En savoir plus / Mehr erfahren:


Read the full article at:

On Thursday, Microsoft researchers announced a new text-to-speech AI model called VALL-E that can closely simulate a person’s voice when given a three-second audio sample. Once it learns a specific voice, VALL-E can synthesize audio of that person saying anything—and do it in a way that attempts to preserve the speaker’s emotional tone.

Its creators speculate that VALL-E could be used for high-quality text-to-speech applications, speech editing where a recording of a person could be edited and changed from a text transcript (making them say something they originally didn’t), and audio content creation when combined with other generative AI models like GPT-3.





Read the full article at:

Combining Wolfram Alpha and GPT-3 could create a powerful tool that answers complex questions and generates human-like text: it could provide in-depth explanations and discussions on a wide range of topics while engaging in natural-sounding conversation.
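The division of labour described here – precise computation on one side, fluent language on the other – can be sketched as a simple query router. Everything below (the function names, the keyword heuristic) is invented for illustration and is not the real Wolfram|Alpha or ChatGPT integration.

```python
# Toy illustration of the "precise computation + fluent language" split:
# route computational-looking queries to a symbolic engine, everything
# else to a chat model. The heuristic and names are illustrative only.

import re

# Crude signal that a query is computational: digits or math keywords.
COMPUTATIONAL_HINTS = re.compile(
    r"\d|\bintegral\b|\bderivative\b|\bsolve\b|\bconvert\b|\bdistance\b",
    re.IGNORECASE,
)

def is_computational(query: str) -> bool:
    """Return True if the query looks like it needs exact computation."""
    return bool(COMPUTATIONAL_HINTS.search(query))

def route_query(query: str) -> str:
    """Name the backend a hybrid system might send this query to."""
    return "wolfram_alpha" if is_computational(query) else "chat_model"
```

For example, `route_query("solve x^2 = 2")` would go to the computation side, while `route_query("write a haiku about rain")` would go to the language model.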


Accessing Wolfram|Alpha’s computational knowledge with ChatGPT–an ideal combination of precise computation with human-like expression of ideas. Stephen Wolfram explains how.


It’s always amazing when things suddenly “just work”. It happened to us with Wolfram|Alpha back in 2009. It happened with our Physics Project in 2020. And it’s happening now with OpenAI’s ChatGPT. Stephen Wolfram has been tracking neural net technology for about 43 years, and he finds the performance of ChatGPT thoroughly remarkable. Suddenly, there is a system that can successfully generate text about almost anything that’s very comparable to what humans might write. It’s impressive and useful, and its success is probably going to tell us some very fundamental things about the nature of human thinking.


But while ChatGPT is a remarkable achievement in automating the doing of major human-like things, not everything that’s useful to do is quite so “human like”. Some of it is instead more formal and structured. And indeed one of the great achievements of our civilization over the past several centuries has been to build up the paradigms of mathematics, the exact sciences—and, most importantly, now computation—and to create a tower of capabilities quite different from what pure human-like thinking can achieve.


Read the full article at:

Over recent weeks, millions of people have tried the new AI chatbot released by OpenAI, built on an upgraded GPT-3 (Generative Pre-trained Transformer). The tool uses a neural network to generate responses from data sourced from the internet. OpenAI, backed by Microsoft, also built and released the currently free DALL-E, an AI image generator.


With its easy user interface, ChatGPT likely has many educators wondering about the future of learning. The platform will improve rapidly when next-generation GPT-4 models emerge, most likely in early 2023 – meaning it’s only going to get much, much better.


AI already does and will continue to impact education – along with every other sector. Innovative education leaders have an opportunity (along with parallel emerging innovations in Web3) to build the foundation for the most personalized learning system we have ever seen. Using these tools, educators can design an equitable and efficient model for every learner to find purpose and agency in their lives – and the opportunity to help solve some of the world’s most pressing challenges.

Read the full article at:

Created by AI

Smart summary:
The thread discusses the upcoming “Prompt Palettes” feature for ChatGPT, which will provide users with pre-written text prompts to help with tasks like formatting raw text, summarizing text, and serving as a programming assistant. The feature will use OpenAI’s GPT-3 Codex models, and is similar to a “brushes” feature being developed by Github Copilot Labs.

ChatGPT will soon drop a new feature – Prompt Palettes!

They’re pre-written text prompts to perform tasks like
– format raw text to markdown
– summarize text
– be a programming assistant
– add text from a link as context

How it works: the EXACT prompts are pre-written bits of text that accentuate a user’s input for a specific task.

It’s like a magic button for text, implemented entirely as pre-written instructions. Here are the prompts for “format” and “summarize”. You should be able to add your own too.
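The mechanic described above – a named set of pre-written instructions that wrap the user’s raw input – can be sketched in a few lines. The template wording here is invented; the actual Prompt Palettes text was not public at the time of the thread.

```python
# Sketch of a "prompt palette": named instruction templates that wrap
# ("accentuate") the user's raw input before it is sent to the model.
# The template text below is made up for illustration.

PALETTE = {
    "format": "Reformat the following raw text as Markdown:\n\n{input}",
    "summarize": "Summarize the following text in a few sentences:\n\n{input}",
    "code_assistant": "Act as a programming assistant. Help with:\n\n{input}",
}

def apply_palette(name: str, user_input: str) -> str:
    """Combine a palette template with the user's input into one prompt."""
    return PALETTE[name].format(input=user_input)
```

Adding your own palette entry would then just be a matter of adding another template string to the dictionary.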

— ChatGPT Coding Assistant —

is slightly different from Prompt Palettes but seems like a pre-written addendum to a query to serve as a one-shot way to focus on a specific vertical task.

This might force the use of the GPT-3 Codex model explicitly.

Prompt palettes act on a specific message, and ChatGPT is adding a nifty “Add text from link” feature which will allow you to, say, summarize websites easily.

Prompt palettes aren’t new! Github Copilot Labs has been working on a similar magic “brushes” feature that integrates directly into VS Code. They use OpenAI’s GPT-3 Codex models too.

Prompt Palettes will bring these powerful new LLM features to the mainstream of 1 million users! 2023 has just begun for AI.

Thanks to @eeeziii for the idea of reverse-engineering ChatGPT (he did this too)!

ChatGPT is a minified React app with chunked JS that uses Server Sent Events with /conversations to stream the meat of the output. It uses text-davinci-002-render with 4097 max tokens.
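The Server-Sent Events detail above can be illustrated with a minimal parser for the SSE wire format: each event is a `data:` line, events are separated by blank lines, and `[DONE]` conventionally ends the stream. The sample payload is a made-up example, not the real /conversations schema.

```python
# Minimal parser for the Server-Sent Events (SSE) wire format that the
# /conversations endpoint reportedly uses to stream output. Events are
# "data: ..." lines separated by blank lines; "[DONE]" marks the end of
# the stream by convention. The payload shape below is invented.

def parse_sse(raw: str) -> list[str]:
    """Extract the data payload of each event from an SSE-formatted string."""
    events = []
    for block in raw.split("\n\n"):
        for line in block.splitlines():
            if line.startswith("data: "):
                payload = line[len("data: "):]
                if payload == "[DONE]":
                    return events
                events.append(payload)
    return events

stream = "data: Hel\n\ndata: lo\n\ndata: [DONE]\n\n"
print("".join(parse_sse(stream)))  # prints "Hello"
```

A real client would read these blocks incrementally off the HTTP response rather than from a complete string, appending each payload to the visible message as it arrives.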

Read the full article at:

Psycholinguist Giosuè Baggio sheds light on the thrilling, evolving field of neurolinguistics, where neuroscience and linguistics meet.


What exactly is language? At first thought, it’s a continuous flow of sounds we hear, sounds we make, scribbles on paper or on a screen, movements of our hands, and expressions on our faces. But if we pause for a moment, we find that behind this rich experiential display is something different: the smaller and larger building blocks of a Lego-like game of construction, with parts of words, words, phrases, sentences, and larger structures still.


We can choose the pieces and put them together with some freedom, but not anything goes. There are rules, constraints. And no half measures. Either a sound is used in a word, or it’s not; either a word is used in a sentence, or it’s not. But unlike Lego, language is abstract: Eventually, one runs out of Lego bricks, whereas there could be no shortage of the sound b, and no cap on reusing the word “beautiful” in as many utterances as there are beautiful things to talk about.

Read the full article at:

Welcome to the “Awesome ChatGPT Prompts” repository! This is a collection of prompt examples to be used with the ChatGPT model.


The ChatGPT model is a large language model trained by OpenAI that is capable of generating human-like text. By providing it with a prompt, it can generate responses that continue the conversation or expand on the given prompt.


In this repository, you will find a variety of prompts that can be used with ChatGPT. We encourage you to add your own prompts to the list, and to use ChatGPT to generate new prompts as well.


To get started, simply clone this repository and use the prompts in the file as input for ChatGPT. You can also use the prompts in this file as inspiration for creating your own.
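Once cloned, the prompts can also be read programmatically. The sketch below assumes the repository stores prompts as a CSV with `act` and `prompt` columns (inlined here as a small invented sample so the example is self-contained).

```python
# Reading an "act"/"prompt" CSV of ChatGPT prompts into a lookup table.
# The file layout (act,prompt columns) is an assumption, and the sample
# rows below are shortened, invented stand-ins for real entries.

import csv
import io

SAMPLE_CSV = '''act,prompt
"Linux Terminal","I want you to act as a linux terminal."
"English Translator","I want you to act as an English translator."
'''

def load_prompts(csv_text: str) -> dict[str, str]:
    """Map each persona ("act") to its prompt text."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {row["act"]: row["prompt"] for row in reader}

prompts = load_prompts(SAMPLE_CSV)
print(sorted(prompts))  # prints "['English Translator', 'Linux Terminal']"
```

Pasting `prompts["Linux Terminal"]` into a ChatGPT session would then start a conversation in that persona.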


We hope you find these prompts useful and have fun using ChatGPT!

View on GitHub

View on Hugging Face

Read the full article at:

Want to give ChatGPT a try? Here are 10 cool things you can do with ChatGPT including writing music, debugging code, and more.


Do you know about ChatGPT? It’s a powerful and versatile language processing tool that can do some pretty cool things. From having a conversation with a virtual assistant to generating text based on a prompt, ChatGPT can be used for a wide range of applications. In this article, we’ll explore some of the cool things you can do with ChatGPT and show you how it can benefit you and your business. Whether you’re a beginner or an experienced user, we’re sure you’ll be impressed by the capabilities of this AI conversational bot. So let’s dive in and discover the cool things you can do with ChatGPT.

Read the full article at:

Our society faces the grand challenge of providing sustainable, secure, and affordable means of generating energy while trying to reduce carbon dioxide emissions to net zero around 2050. To date, developments in fusion power, which potentially ticks all these boxes, have been funded almost exclusively by the public sector. However, something is changing. Private equity investment in the global fusion industry has more than doubled in just one year – from US$2.1 billion in 2021 to US$4.7 billion in 2022, according to a survey from the Fusion Industry Association. So, what is driving this recent change? There’s lots to be excited about.


Read the full article at:

The problem with trying to fly through a gas giant like Jupiter is that “the density, pressure and temperature all increase to such enormous levels as you penetrate down into the interior that it is impossible to penetrate any probe through.” Near the center of Jupiter, the normally gaseous hydrogen turns into a liquid metal, making this region as exotic as the surface of the sun. To give a sense of the pressure near the center of Jupiter, consider the Mariana Trench on Earth, the deepest place in our oceans. At nearly 7 miles (11 km) deep, pressures reach just over 1,000 bars (100,000 kilopascals), roughly 7 tons of force per square inch (about 1,000 kilograms of force per square centimeter). At sea level, you experience about 1 bar of pressure (100 kilopascals). Near the center of Jupiter, pressures jump to megabars, or one million bars, Fletcher said. On top of those enormous pressures, temperatures also rise into the tens of thousands of kelvins, equivalent to tens of thousands of degrees Celsius. At that point, any spacecraft wouldn’t just be squished or melted; it would entirely disintegrate into its constituent atoms.
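The pressure comparisons in the passage can be checked with explicit unit conversions, a quick sanity check on the figures rather than anything from the article itself.

```python
# Checking the passage's pressure figures with explicit unit conversions.
# 1 bar = 100,000 Pa; 1 psi ≈ 6,894.76 Pa; 1 kgf/cm² ≈ 98,066.5 Pa.

BAR_IN_PA = 100_000.0
PSI_IN_PA = 6_894.76
KGF_CM2_IN_PA = 98_066.5

def bar_to_psi(bar: float) -> float:
    """Convert bars to pounds-force per square inch."""
    return bar * BAR_IN_PA / PSI_IN_PA

def bar_to_kgf_cm2(bar: float) -> float:
    """Convert bars to kilograms-force per square centimeter."""
    return bar * BAR_IN_PA / KGF_CM2_IN_PA

# 1,000 bar (the Mariana Trench floor) works out to about 14,500 psi,
# i.e. roughly 7.25 short tons per square inch, or ~1,020 kgf/cm².
print(round(bar_to_psi(1_000)))      # prints "14504"
print(round(bar_to_kgf_cm2(1_000)))  # prints "1020"
```

A megabar near Jupiter’s center is a thousand times larger again – about 14.5 million psi.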


Read the full article at: