
Canvas, Meet Comet: The Browser That Writes (and Takes) Your Quizzes

Watch AI shave hours off lesson prep—then threaten your assessments.

Note: The Canvas example starts at 14:55 in the video.

Like many educators who explore the forefront of educational technology, I approach testing new AI tools with equal parts enthusiasm and careful skepticism. So when Perplexity launched Comet, their new AI-powered web browser, naturally, I felt compelled to see exactly what it could do—not just because I’m an educator and AI researcher, but because part of my job is anticipating how tools like this will reshape our classrooms and how we prepare students (and ourselves) for what's coming next.

Spoiler alert: Comet is intriguing, innovative, and still pretty rough around the edges. But it also hints at the imminent future of web-integrated AI—and that future is closer than most educators realize.

What Exactly Is Comet?

Comet is a web browser developed by Perplexity, a company best known for its AI-driven search tool—think less chatbot and more research assistant. Imagine Chrome or Safari, but with one critical difference: it includes an AI Assistant built directly into the browser itself. This assistant doesn't just suggest—it actively takes over your screen, moves your mouse, types, scrolls, and clicks buttons.

Naturally, my curiosity was piqued. Could it solve puzzles, build lesson plans, or ace quizzes in Canvas?

Let’s dive in.

Experiment #1: Solving Wordle

Comet quickly solved Wordle in four attempts, displaying thoughtful reasoning and adjustments. Observing it felt like watching a capable student rapidly talk through their logic.

Verdict: Impressive success.

Experiment #2: Crosswords and Connections

Things got ethically complicated here.

When tackling the New York Times Mini Crossword, Comet didn't reason—it outright cheated, scraping answers from the web and displaying them prominently. Educationally troubling, to put it mildly.

Next, Connections—a game requiring abstract categorization—completely stumped Comet. It clicked aimlessly without success.

Verdict: Failure, with ethical issues.

Experiment #3: Writing a LinkedIn Comment

Comet’s comment on an Atlantic article was bland and generic, clearly produced without actually reading the article itself. It felt exactly like the kind of writing that folks can instantly identify as AI-generated.

Verdict: Disappointing.

Experiment #4: Building a Spotify Playlist

This was surprisingly entertaining. Comet enthusiastically generated a playlist title, selected songs, and even began creating the playlist on Spotify. However, it stumbled, creating two playlists instead of one. Technically completed, but messy.

Verdict: Fun, functional, a bit chaotic.

Experiment #5: Creating a Canva Slide Deck (Newton’s Laws)

Comet struggled badly here. It produced a visually incoherent stack of text boxes lacking design or clarity. If a student submitted this, it would certainly warrant revision.

Verdict: Complete miss.

Experiment #6: Creating a Scatter Plot in Google Sheets

Now this was genuinely helpful. Given data on hours studied vs. test scores, Comet initially worked outside Google Sheets, but once redirected, it flawlessly created a correctly labeled scatter plot. Very useful.

Verdict: Legitimately helpful.
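
For anyone who wants the same chart without an AI agent (or Google Sheets), a few lines of Python will do it. This is just a sketch with made-up sample data; the numbers below aren't from my actual experiment.

```python
# A rough sketch of a "hours studied vs. test scores" scatter plot.
# The data here is invented purely for illustration.
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

hours = [1, 2, 3, 4, 5, 6, 7, 8]           # hours studied
scores = [52, 58, 65, 70, 74, 81, 85, 90]  # test scores

fig, ax = plt.subplots()
ax.scatter(hours, scores)
ax.set_xlabel("Hours Studied")
ax.set_ylabel("Test Score")
ax.set_title("Hours Studied vs. Test Scores")
fig.savefig("scatter.png")
```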

Experiment #7: Creating and Taking a Quiz in Canvas

Canvas, a popular LMS, serves as the digital classroom for countless educators. I created a basic Algebra course and instructed Comet to build a short, multiple-choice quiz.

Comet instantly generated three algebra problems with correct answers. Then, without explicit direction, it navigated Canvas's quiz builder, inputting questions, answers, and settings seamlessly. As someone who’s manually entered countless quizzes into Canvas, this felt revolutionary—a massive timesaver for teachers.
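
Worth noting: Comet drove Canvas's point-and-click quiz builder, but Canvas also exposes a REST API that can do the same job from a script. The sketch below is a rough, untested illustration, not a recipe: the domain, token, course ID, and the helper functions are all hypothetical placeholders, though the `/api/v1/courses/:id/quizzes` endpoints are part of the real Canvas API.

```python
# Hypothetical sketch: pushing a multiple-choice quiz to Canvas via its REST API.
# BASE, TOKEN, and COURSE_ID are placeholders -- swap in your own values.
import json
import urllib.request

BASE = "https://yourschool.instructure.com"
TOKEN = "YOUR_API_TOKEN"
COURSE_ID = 1234

def build_mc_question(name, text, correct, distractors, points=1):
    """Build the payload shape Canvas expects for one multiple-choice question."""
    answers = [{"answer_text": correct, "answer_weight": 100}]
    answers += [{"answer_text": d, "answer_weight": 0} for d in distractors]
    return {"question": {
        "question_name": name,
        "question_type": "multiple_choice_question",
        "question_text": text,
        "points_possible": points,
        "answers": answers,
    }}

def post_json(url, payload):
    """POST a JSON payload to Canvas with token auth and return the response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def push_quiz(title, questions):
    """Create a quiz shell, then attach each question to it."""
    quiz = post_json(f"{BASE}/api/v1/courses/{COURSE_ID}/quizzes",
                     {"quiz": {"title": title, "quiz_type": "assignment"}})
    for q in questions:
        post_json(f"{BASE}/api/v1/courses/{COURSE_ID}/quizzes/{quiz['id']}/questions", q)

# Example usage (would hit a live Canvas instance, so left as a comment):
# push_quiz("Algebra Check-In",
#           [build_mc_question("Solve for x", "If 2x + 3 = 11, what is x?",
#                              "4", ["3", "7"])])
```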

But then curiosity pushed me further. Could Comet also take that quiz?

Switching to student mode, I watched Comet scan questions, reason answers, and accurately select choices—all without human intervention. It behaved exactly like a human student navigating Canvas.

Educators have spent the past two years discussing the unethical uses of AI. What makes Comet different isn't its ability to solve quiz problems. The unique part is that it can take control of your browser, navigate your Canvas coursework on its own, solve every problem in context without copying and pasting anything into a separate chatbot, and then type the answers directly into Canvas itself. That really is new, and to my mind it marks the beginning of the end for AI detection of any kind (not that AI detectors have ever worked).

This experiment was undeniably fascinating, but it also marked a subtle, critical shift in AI capabilities within education.

Verdict: Technically impressive; educationally sobering.

Key Takeaways

While Comet's agentic abilities aren't quite what I'd hoped, the browser itself is at least as good as Chrome or Safari. I still recommend giving Comet a try: it's likely to improve quickly, and switching doesn't leave you any worse off. More importantly, it vividly demonstrates what's just around the corner:

✅ Teachers can rapidly digitize content, streamline admin tasks, and generate visual data.

❌ Students can effortlessly circumvent assessments, access web-scraped answers, and produce superficially original content.

🤔 It's equally powerful and problematic.

As educators and school leaders, this isn’t just a passing curiosity. It’s a clear signal that we must proactively reconsider how we design learning experiences, define academic integrity, and guide students through an AI-rich landscape.

Thanks for going on this journey with me!

Stay curious. Stay human. Teach boldly.

—Michael
