Mathematics

  1. Bloom’s Taxonomy Needs an Update for the AI Age (Opinion) — The article argues that Bloom’s Taxonomy still helps educators think about cognitive demand, but it needs to be reimagined for the AI era because generative AI changes how students create, evaluate, and revise work. The author says teachers should help students build foundational skills without AI while also learning to direct and critique AI-assisted learning.

    Since ChatGPT’s public release in November 2022, the rapid advancement of generative artificial intelligence has been reshaping the landscape of teaching and learning. Tools that instantly generate text, images, and other products have created an environment where thinking and creativity can be easily outsourced to machines, often leaving educators questioning the authenticity of student work and asking which cognitive skills will be most important to us as learners and doers.

    Teachers are right to be concerned about our future as creative, independent thinkers and problem solvers.

    Bloom’s Taxonomy has long been a tool educators could use to identify levels of cognitive demand in the classroom.

    Originally developed in 1956 and revised in 2002, the framework provides educators with shared language for curriculum and assessment design.

    It organizes learning from lower-order to higher-order thinking skills, starting with foundational skills like remembering and understanding and progressing through the increasingly complex ones of applying, analyzing, evaluating, and creating.

    However, generative AI’s ever-growing presence raises important questions for educators: Does this hierarchical framework still reflect the mental skills teachers should be cultivating in their students?

    And should we perhaps abandon Bloom’s framework altogether?

    Before generative AI, creation—synthesizing ideas from one’s own knowledge and experiences into a final product—was designated the pinnacle of cognitive complexity.

    Now, a human author needs only an effective prompt to almost instantly create text, images, video, code, or data analysis.

    Creation occurs early in the process rather than as a culminating step.

    In fact, the traditional model of moving from the lower-order thinking skills to the higher-order ones does not align with how today’s learners interact with generative AI.

  2. How Teens and Young People Use AI Tools for Learning and Mental Health Support — Two new reports suggest that teens and young adults are using generative AI in very different ways, from learning and daily use to emotional support and mental health help. The findings also show that some young people turn to AI when professional care or adult support is hard to access, raising concerns about safety, trust, and how schools should respond.

    Two new research reports highlight growing concerns about the impact of AI use on teens and young adults’ mental health and how they use the technology to address those challenges.

    The reports—both produced by Young Futures, a nonprofit dedicated to improving the wellbeing of young people in the age of AI; Surgo Health, a public benefit corporation focused on healthcare access; and the JED Foundation, a nonprofit that supports emotional health and suicide prevention for teens and young adults—examine topics that are top of mind for K-12 educators as schools expand their use of artificial intelligence.

    Both reports drew on survey data from 1,340 people ages 13 to 24.

    One of the two reports—Youth Mental Health in the AI Era: Why Context Matters More Than Technology—outlines six different ways that teens and young adults are engaging with AI. While the majority are using it to learn and grow, 9% have been identified as “emotionally entangled superusers,” or those who are emotionally vulnerable and turn to AI for emotional connection.

    The second report—Youth Mental Health in the AI Era: How GenAI Enters Help-Seeking Pathways—found that many teens and young adults who experienced mental health problems actively used GenAI to address those challenges. Notably, those who did so were more likely to experience barriers to accessing quality professional care, according to the report.

    “Young people aren’t a monolith and their relationship with AI reflects the broader context of their lives—their relationships, their stressors, and their access to support systems and caring adults,” said Kristine Gloria, chief operating officer and co-founder of Young Futures, in a press release that accompanied the reports.

  3. How math logic puzzles can get students ‘proof-ready’ — Math logic puzzles can help students build deductive reasoning and “proof-readiness” by giving them playful, collaborative ways to solve problems before they reach geometry. The article says puzzles like Sudoku and Shikaku can show that there is often more than one strategy to reach a correct answer, and they can help students who may not shine on traditional tests find success in math.

    Dive Brief:

    Math logic puzzles can prompt students to work together in a way that incorporates play and critical thinking while easing the usual pressures and daily routine of solving standard math problems, math education experts say.

    Deductive reasoning is a step-by-step process for using information to find a unique solution to such puzzles. But students often don’t exercise that skill in the traditional math curriculum until they take geometry in high school and are required to use it to solve proofs, said Jeff Wanko, dean of the College of Education and Health Sciences and a math education professor at Bradley University in Peoria, Ill.

    Organizations such as the National Council of Teachers of Mathematics, as well as other educators, say waiting that long does students a disservice, Wanko said. “I see these puzzles as one way we can develop what I call ‘proof-readiness.’”

    Dive Insight:

    Sudoku, invented in the U.S. in the late 1970s but renamed and popularized by the Japanese puzzle magazine Nikoli, is probably the most well-known of these math puzzle games. But there are others — many of which originated in Japan — including Shikaku, Kakuro and Hashiwokakero, said Wanko.

    “One of the coolest things is, for every puzzle I have ever seen, there’s not just one way to get to that unique solution,” said Wanko, who taught at Miami University of Ohio for 25 years before moving to Bradley this school year. “There are a lot of problem-solving strategies.”

    For example, Shikaku, one of Wanko’s go-to puzzles, starts with numbers on a grid, similar to Sudoku, except the puzzle-solver needs to draw a rectangle around each number whose area equals that number — and on top of that, the rectangles need to fit together like a puzzle, tiling the grid without overlapping.

    “All the spaces in the grid get used,” he said. “It’s not just guess-and-check, put a ‘2’ in this box. That’s not deductive reasoning. With real deductive reasoning, I’ve eliminated the other possibilities, and only a ‘2’ can go in this box.”
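    Wanko’s “only a ‘2’ can go in this box” example can be sketched in a few lines of code. The following minimal Python sketch (an illustration, not anything from the article) applies standard Sudoku elimination: a digit is forced into a cell only when every other candidate is ruled out by the digits already placed in its row, column, and 3x3 box.

```python
def forced_digit(grid, row, col):
    """Return the only digit that can deductively go in (row, col),
    or None if more than one candidate survives elimination.
    grid is a 9x9 list of lists, with 0 marking empty cells."""
    # Digits already visible from this cell's row and column.
    seen = set(grid[row]) | {grid[r][col] for r in range(9)}
    # Digits in the cell's 3x3 box.
    br, bc = 3 * (row // 3), 3 * (col // 3)
    seen |= {grid[r][c] for r in range(br, br + 3) for c in range(bc, bc + 3)}
    candidates = set(range(1, 10)) - seen
    # Deduction, not guess-and-check: exactly one possibility must remain.
    return candidates.pop() if len(candidates) == 1 else None
```

    A cell whose row already holds 1 through 8, for instance, is forced to 9; a cell with several open candidates returns None, signaling that more elimination is needed first.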

  4. The quest to build a better AI tutor — The article says researchers are still trying to improve AI tutors, and one promising approach is adjusting the difficulty of practice problems to match each student’s progress. In a study of Taiwanese high school students learning Python, the personalized problem sequence led to better exam results and more practice time, though the piece notes that human support may still be needed for less motivated students.

    It’s easy to get swept up in the hype about artificial intelligence tutors. But the evidence so far suggests caution.


    Some studies have found that chatbot tutors can backfire because students lean on them too heavily, get spoonfed solutions and fail to absorb the material. Even when AI tutors are designed not to give away answers, they haven’t consistently produced better results than learning the old-fashioned way without AI.

    Still, researchers who have produced these skeptical studies haven’t given up hope. Some are still experimenting, trying to build better AI tutors.

    One promising idea has less to do with how an AI tutor explains concepts and more to do with what it asks students to practice next.

    A team at the University of Pennsylvania, which included some AI skeptics, recently tested this approach in a study of close to 800 Taiwanese high school students learning Python programming.

    All the students used the same AI tutor, which was designed not to give away answers.

    But there was one key difference. Half the students were randomly assigned to a fixed sequence of practice problems, progressing from easy to hard.

    The other half received a personalized sequence with the AI tutor continuously adjusting the difficulty of each problem based on how the student was performing and interacting with the chatbot.
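    The article does not describe the tutor’s sequencing algorithm. As a deliberately simple, hypothetical illustration of the idea, a staircase policy steps the difficulty up after a correct answer and down after an incorrect one, keeping each learner near the edge of their ability:

```python
def next_difficulty(current, correct, step=1, lo=1, hi=10):
    """Pick the difficulty level for the next practice problem.
    Moves up after a correct answer, down after an incorrect one,
    clamped to the valid range [lo, hi]."""
    proposed = current + step if correct else current - step
    return max(lo, min(hi, proposed))
```

    Real adaptive systems typically use richer signals than a single right/wrong bit (time on task, hint requests, chat interactions), but the clamped up/down step captures the core contrast with a fixed easy-to-hard sequence.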

Physics

  1. Mixing generative AI with physics to create personal items that work in the real world — MIT researchers have created a system called PhysiOpt that combines generative AI with physics simulations so 3D designs like cups, keyholders, and bookends are both creative and structurally sound. The tool can turn prompts or images into realistic objects that are more likely to work in the real world, while also speeding up the design process.

    Have you ever had an idea for something that looked cool but wouldn’t work well in practice? When it comes to designing things like decor and personal accessories, generative artificial intelligence (genAI) models can relate. They can produce creative and elaborate 3D designs, but when you fabricate those blueprints as real-world objects, the results usually don’t withstand everyday use.

    The underlying problem is that genAI models often lack an understanding of physics.

    While a tool like Microsoft’s TRELLIS system can create a 3D model from a text prompt or image, its design for a chair, for example, may be unstable or have disconnected parts.

    The model doesn’t fully understand what your intended object is designed to do, so even if your seat can be 3D printed, it would likely fall apart under the force of someone sitting down.

    In an attempt to make these designs work in the real world, researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) are giving generative AI models a reality check.

    Their “PhysiOpt” system augments these tools with physics simulations, making blueprints for personal items such as cups, keyholders, and bookends work as intended when they’re 3D printed.

    It rapidly tests whether the structure of your 3D model is viable, gently modifying smaller shapes while ensuring the overall appearance and function of the design are preserved.
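    PhysiOpt’s internals aren’t detailed in the article. As a generic, hypothetical sketch of simulation-in-the-loop repair (not the actual system), one can make small random tweaks to shape parameters and keep only the tweaks that improve a physics score without drifting too far from the original appearance:

```python
import random

def repair(params, physics_score, appearance_drift, steps=200, max_drift=0.1):
    """Local search over shape parameters: accept a small random tweak
    only if it raises the physics score while keeping the appearance
    within max_drift of the original design."""
    best = dict(params)
    for _ in range(steps):
        trial = dict(best)
        key = random.choice(list(trial))
        trial[key] += random.uniform(-0.01, 0.01)  # gentle local change
        if appearance_drift(trial) <= max_drift and \
           physics_score(trial) > physics_score(best):
            best = trial
    return best
```

    In this toy framing, physics_score would wrap a stability simulation (e.g., penalizing a tippy base) and appearance_drift a geometric distance to the generated design; both names are placeholders for illustration.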

    You can simply type what you want to create and what it’ll be used for into PhysiOpt, or upload an image to the system’s user interface, and in roughly half a minute, you’ll get a realistic 3D object to fabricate.

    For example, CSAIL researchers prompted it to generate a “flamingo-shaped glass for drinking,” which they 3D printed into a drinking glass with a handle and base resembling the tropical bird’s leg.

  2. 3 Questions: On the future of AI and the mathematical and physical sciences — MIT’s Jesse Thaler says AI and the mathematical and physical sciences can strengthen each other: science helps build better AI, and AI helps scientists make discoveries faster. The piece highlights the need for coordinated investment in computing, data, training, and interdisciplinary researchers who can work across both fields.

    Curiosity-driven research has long sparked technological transformations. A century ago, curiosity about atoms led to quantum mechanics, and eventually the transistor at the heart of modern computing. Conversely, the steam engine was a practical breakthrough, but it took fundamental research in thermodynamics to fully harness its power.

    Today, artificial intelligence and science find themselves at a similar inflection point. The current AI revolution has been fueled by decades of research in the mathematical and physical sciences (MPS), which provided the challenging problems, datasets, and insights that made modern AI possible.

    The 2024 Nobel Prizes in physics and chemistry, recognizing foundational AI methods rooted in physics and AI applications for protein design, made this connection impossible to miss.

    In 2025, MIT hosted a Workshop on the Future of AI+MPS, funded by the National Science Foundation with support from the MIT School of Science and the MIT departments of Physics, Chemistry, and Mathematics.

    The workshop brought together leading AI and science researchers to chart how the MPS domains can best capitalize on — and contribute to — the future of AI.

    Now a white paper, with recommendations for funding agencies, institutions, and researchers, has been published in Machine Learning: Science and Technology.

    In this interview, Jesse Thaler, MIT professor of physics and chair of the workshop, describes key themes and how MIT is positioning itself to lead in AI and science.

Computer Science

  1. How 5 Colleges Are Approaching AI — Colleges are taking very different approaches to AI, from required first-year AI literacy at Agnes Scott and embedded AI skills at DeVry to liberal arts-focused programming at Richmond and library-led experimentation at Bryn Mawr. Cornell is also teaching critical thinking explicitly, showing how schools are trying to help students use AI responsibly while building durable academic skills.

    Colleges nationwide are taking varied approaches to integrating AI even as they face pushback from students and faculty.

    Artificial intelligence is rapidly reshaping conversations at colleges and universities nationwide. Institutions are rolling out new courses, majors and microcredentials about AI while launching campuswide initiatives to integrate the tools into teaching and learning.

    But those efforts have also sparked pushback from faculty and students as university leaders make decisions in largely uncharted territory.

    Recent survey data from Packback underscores both the opportunity and the concern. In one survey of nearly 700 college students, about 5 percent said they frequently use AI to generate full assignments—a rate comparable to pre-AI forms of contract cheating. Many students cited time constraints, lack of understanding or low interest as reasons for turning to AI shortcuts.

    Other research suggests students are using AI for more than academics. A national survey from Surgo Health, in partnership with Young Futures and The Jed Foundation (JED), found that some young people are turning to generative AI for emotional support—particularly when their needs aren’t being met offline. While short-term relief was common, outcomes were less consistent when AI replaced, rather than supplemented, existing support systems.

    Against this backdrop, colleges are taking a variety of approaches to integrating AI. Here’s how five institutions are putting their strategies into practice.

  2. How the Education Department will prioritize AI in awarding grants — The Education Department has finalized new grant priorities that give more weight to AI projects that expand AI literacy, support ethical use, and improve student outcomes. The priorities also favor uses such as teacher training, personalized learning, special education support, and reducing administrative tasks, though experts say schools still need clearer guidance and dedicated funding.

    Dive Brief:

    The U.S. Department of Education is continuing to push for artificial intelligence use in classrooms through newly finalized priorities and definitions for districts and colleges applying for any of the agency’s discretionary grant programs.

    The department’s final rule, issued Monday, said it will prioritize applications for projects that aim to expand the understanding of AI or the appropriate and ethical use of AI in education.

    Within those parameters, proposals that call for integrating AI literacy skills into teaching and learning practices that improve student outcomes will be given more weight, according to the rule.

    Dive Insight:

    Under the new rule, which takes effect May 13, other AI grant priorities for K-12 include proposals to:

    Expand age-appropriate AI and computer science education offerings in schools.

    Embed AI and computer science lessons into teacher preparation programs.

    Provide professional development for educators to integrate AI into their subject areas.

    Offer dual-enrollment credit opportunities for high schoolers to earn college credits or industry credentials in AI.

    Use AI to support K-12 services, including early intervention and special education, for students with disabilities and their families.

  3. ‘First thing I’ve written in 3 years’: Students’ AI habits prompt teacher training, lesson design — The article says teachers need professional development and careful lesson design to make AI useful in class without letting students use it to avoid thinking and writing. Two teachers describe using ChatGPT as a learning tool while adding guardrails like handwritten work, process-based grading, and in-class discussions to support critical thinking.

    Dive Brief:

    Successfully infusing artificial intelligence into the classroom means boosting students’ AI literacy without using the tech to offload their thinking. But that requires teachers first getting up to speed on AI through professional development and being intentional about how they design lessons and assignments, according to a pair of teachers who have used ChatGPT in their curriculum.

    “Invest in your own education,” said Coral Riley, an AP computer science teacher at Pine Lake Preparatory in Mooresville, North Carolina, who has participated in a statewide effort to develop guidelines for educators and has invested hundreds of hours in learning about AI. “Advocate for asking teachers to get time to train. It does require time.”

    Casey Cuny, who teaches sophomore honors English and a senior mythology and folklore elective at Valencia High School in California — where he was the state’s 2024 Teacher of the Year — agrees with Riley’s assessment. “Some teachers are a little head-in-the-sand about it, not even realizing how much kids are using it,” he said.

    Dive Insight:

    “I tell teachers, ‘Anything you send home, you have to assume is being AI’d,’” Cuny said. “When teachers do gain AI literacy and learn ChatGPT, they realize, ‘Wow, I need to pay more attention to this.’”

    Cuny says he stopped giving homework “years ago” due to research that recommended against it even before AI. Recently, he was reminded why: A senior kept badgering him about why he couldn’t take an in-class writing assignment home — and then, stymied in that request, admitted, “Low-key, Mr. Cuny, this is the first thing I’ve written in three years.”

    When Cuny expressed surprise, the student added, “Yeah, I just put it into ChatGPT.”

    Riley has been teaching AI and machine learning as part of her course for the past couple of years. She started having students write and test AI models, talking with them about the ethics of AI use and how to analyze potential gaps and large language model hallucinations.


Sources

  1. Bloom’s Taxonomy Needs an Update for the AI Age (Opinion) – Education Week
  2. How Teens and Young People Use AI Tools for Learning and Mental Health Support – Education Week
  3. How math logic puzzles can get students ‘proof-ready’ – K-12 Dive
  4. The quest to build a better AI tutor – The Hechinger Report
  5. Mixing generative AI with physics to create personal items that work in the real world – MIT News
  6. 3 Questions: On the future of AI and the mathematical and physical sciences – MIT News
  7. How 5 Colleges Are Approaching AI – Inside Higher Ed
  8. How the Education Department will prioritize AI in awarding grants – K-12 Dive
  9. ‘First thing I’ve written in 3 years’: Students’ AI habits prompt teacher training, lesson design – K-12 Dive