Quick Study! AI in a Nutshell

AI literacy is essential for both students and educators. But you don't have to know everything in order to get started. If you only have a few minutes, check out the following basics to get you going. If you have more time, check out this more extensive exploration of AI for educators or refer back to a list of essential AI vocabulary.





Nitty Gritty: Essential AI for Educators

What is AI? 

Artificial Intelligence (AI) is like a super-smart computer program that can learn, reason, and perform tasks that typically require human intelligence. It's designed to process vast amounts of information, recognize patterns, and make decisions or predictions based on that data. 

Is AI new? 

Not really! AI has been around since the 1950s, but it has made huge leaps in recent years. What's new is how powerful and accessible it has become, thanks to advances in computing power and the sheer amount of data available. 

Why am I just hearing about AI now? 

AI has recently hit a tipping point where it's become really good at tasks like understanding and generating human language. This has made it relevant to everyday life and work, sparking widespread interest and discussion. 

Briefly, how does AI work? 

AI works by learning from large amounts of data. It uses complex algorithms to find patterns in this data and then applies what it's learned to new situations. For language-based AI, it predicts what words or ideas are likely to come next based on the context it's given.  
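The prediction idea can be seen in toy form with a model that simply counts which word follows which in a small sample of text, then guesses the most common follower. Real language models use neural networks trained on billions of examples rather than raw counts, so this is only an illustrative sketch of the "predict what comes next" concept, not how GPTs are actually built:

```python
from collections import Counter, defaultdict

# Toy "language model": count which word follows which in a tiny corpus.
corpus = "the cat sat on the mat the cat ate the fish".split()

following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the training text."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" twice, more than any other word
```

Notice the model's limits mirror real AI's limits: it can only predict from what it has seen ("zebra" never appears, so it returns nothing), and its guesses reflect whatever patterns, or biases, were in its training text.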

What is a GPT, and how is it different from ChatGPT? 

GPT stands for Generative Pre-Trained Transformer.  

  • Generative means it can create new content, like text or code.
  • Pre-trained means it's learned from a vast amount of existing data. 
  • Transformer refers to the specific type of AI architecture it uses. 

In simple terms, a GPT is a type of AI model that's really good at understanding and generating human-like text. Think of a GPT like a super-advanced autocomplete. Instead of simply guessing the next word in a text message, it can generate entire paragraphs or even long documents that make sense in context. It is this ability that makes GPTs so powerful and versatile for all kinds of language-related tasks. 

GPT is the technology behind many popular AI chatbots and writing assistants, not just ChatGPT. ChatGPT is simply one such chatbot, created by the company OpenAI. 

What's the difference between ChatGPT, Claude, and Gemini? 

ChatGPT, Claude, and Gemini are all Large Language Models (LLMs) created by different companies. While they serve similar purposes (engaging in conversation, answering questions, helping with tasks), they can differ in their capabilities, knowledge base, and the specific ways they've been trained. It's a bit like using different smartphones: the same basic idea, but each has its own features and strengths. 

  • ChatGPT (developed by OpenAI) 
  • Claude (developed by Anthropic) 
  • Gemini (developed by Google) 

New tools and developments are emerging every day in this competitive field, so it's best to keep an ear out for new terms that come up repeatedly.

What is prompt engineering and why is it important? 

Prompt engineering is the art and science of crafting effective instructions or questions to get the best results from AI. It's important because the quality and specificity of your prompt greatly affect the AI's output. Good prompt engineering helps you harness the full potential of AI tools, making them more useful and accurate for your specific needs. 

In general, the best piece of advice for prompt engineering is to have a conversation. AI is not Google. You can ask one-off questions, but AI works best when you view it as a thought partner and engage in a back-and-forth dialogue that elevates your work and ideas, rather than using it to replace learning or critical thinking.
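To see why specificity matters, compare two ways of asking for the same thing. The classroom details in this sketch are invented purely for illustration; both prompts are just plain text you could type into any chatbot:

```python
# Two prompts for the same task. The difference is how much role, audience,
# topic scope, and format information each one carries.
# (The classroom details below are invented purely for illustration.)

vague_prompt = "Write a quiz about fractions."

specific_prompt = (
    "You are a 4th-grade math teacher. Write a 5-question multiple-choice "
    "quiz on comparing fractions with unlike denominators. Include an "
    "answer key, and keep the reading level appropriate for 9-year-olds."
)
```

The vague prompt leaves the AI to guess the grade level, length, and format; the specific one names a role, an audience, a scope, and a deliverable, giving the AI far more context to work with.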

To learn more, check out our Prompt Engineering Basics overview.

How reliable is AI-generated information? 

AI-generated information can be impressively accurate, but it is not infallible. To understand its reliability, it helps to know how AI models are trained. 

Large Language Models (LLMs) like GPTs are trained on vast amounts of text data from the internet and other sources. They learn patterns and relationships in this data, which allows them to generate human-like text. However, this process has some limitations, including:

  1. Outdated Information: The AI's knowledge is limited to its training data, which currently has a cutoff date. It doesn't automatically update with current events. 
  2. Hallucinations: Sometimes, AI can generate plausible-sounding but incorrect information. This is called a "hallucination." It happens when the AI makes connections that aren't accurate or fills in gaps with made-up details. 
  3. Bias: The training data can contain human biases, which the AI might learn and reproduce. This could lead to biased or unfair outputs. 
  4. Lack of True Understanding: While AI can process and generate text impressively, it doesn't truly understand context or have real-world experience like humans do. 
  5. Inconsistency: AI might give different answers to the same question asked in different ways. 

Because of these factors, it is crucial to approach AI-generated information with a critical eye. Use AI as a helpful starting point or brainstorming tool, but always verify important information with reliable sources. For educators and students, this presents an excellent opportunity to practice and teach critical thinking skills. 

Remember, AI is a powerful tool, but it's not a replacement for human knowledge and judgment. Your expertise, critical thinking, and real-world understanding are essential in interpreting and using AI-generated information effectively. Together, we can do incredible things. 


Learn More 

Curious for more? Excellent! Check out a more in-depth analysis of what AI is and how it impacts students and educators, or explore the essentials of prompt engineering.