GPT-5

GPT-5.3 Instant Arrives: Smarter Web Answers, Fewer AI Mistakes for ChatGPT

eSpeaks’ Corey Noles talks with Rob Israch, President of Tipalti, about what it means to lead with Global-First Finance and how companies can build scalable, compliant operations in an increasingly ...

Show HN: How I built a resume editor using AI with zero web dev experience

Hi, I have recently been applying for summer internships and got frustrated when tailoring my resumes in Word. I started learning Python last autumn, but had absolutely zero experience with web development or deploying something to the front/backend. I wanted to experiment with the new coding agents to build a resume editor that would make my application process less painful.

Here it is: www.tailortojob.app

How I built it: A friend helped me set up the initial infrastructure because I struggle

I'm rebuilding Markdown editing in VS Code (would love feedback)

I’ve recently started coding again after a long time, and this is one of the first tools I’m building. It’s an AI-assisted markdown editor designed to live inside VS Code and reduce the friction of writing and editing documentation.

Why? I wanted a markdown editor that handled line breaks and simple bullets in markdown tables.

Visual Studio Marketplace: https://marketplace.visualstudio.com/items?itemName=kamransethi.gpt-ai-markdown-editor
GitHub: https://github.com/kamransethi

I'm 11 and trained a custom MoE LLM for $1

# I'm 11 years old and I trained my own LLM from scratch. 50 people downloaded it in 24 hours.

Hey r/LocalLLaMA,

I'm Arthur, I'm 11 years old, and I just released *Wind Arc 1.6* — a custom architecture LLM I built and trained myself.

## What it is

Wind Arc 1.6 is a 3.6B parameter model with a custom architecture I designed:

- *Mixture of Experts FFN* — 4 routed experts + 1 shared expert per layer (replaces standard MLP)
- *YaRN RoPE* — extends context from 8k → 32k tokens
- *Hybri
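The routing scheme described in that post (a few routed experts plus one always-on shared expert, replacing the standard MLP) can be sketched in plain NumPy. The weights, top-2 routing, and softmax gating below are illustrative assumptions, since the post doesn't publish those details:

```python
import numpy as np

rng = np.random.default_rng(0)
d, hidden, n_experts = 8, 16, 4

# One weight pair per routed expert, plus a shared expert applied to every token.
routed = [(rng.standard_normal((d, hidden)) * 0.1,
           rng.standard_normal((hidden, d)) * 0.1) for _ in range(n_experts)]
shared = (rng.standard_normal((d, hidden)) * 0.1,
          rng.standard_normal((hidden, d)) * 0.1)
router_w = rng.standard_normal((d, n_experts)) * 0.1  # hypothetical linear router

def expert(x, w):
    # A plain 2-layer MLP with ReLU stands in for each expert FFN.
    w_in, w_out = w
    return np.maximum(x @ w_in, 0.0) @ w_out

def moe_ffn(x, top_k=2):
    """x: (tokens, d). Each token is processed by the shared expert plus
    its top-k routed experts, weighted by renormalized softmax gates."""
    logits = x @ router_w                      # (tokens, n_experts) router scores
    out = expert(x, shared)                    # shared expert sees every token
    for t in range(x.shape[0]):
        idx = np.argsort(logits[t])[-top_k:]   # indices of the chosen experts
        gates = np.exp(logits[t, idx])
        gates /= gates.sum()                   # softmax over the selected experts
        for g, e in zip(gates, idx):
            out[t] += g * expert(x[t:t + 1], routed[e])[0]
    return out

y = moe_ffn(rng.standard_normal((3, d)))
print(y.shape)  # (3, 8)
```

The point of the shared expert is that only the routed experts are sparsely activated; common features go through one dense path while per-token specialization is spread across the routed set.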

What would you do if you have AI software that may be transformers alternative?

Assuming your software meets some of these:

* Better than potential successors of SOTA transformers: Mamba, Hyena, RWKV, xLSTM;
* Gemini/Claude estimate potential IP value in the millions;
* All implemented in C (ok, this may be a minus) but also ported to C# and F# - so no Python nor Rust;
* Humble code size: 10-15k lines of code, but I mean come on, GPT-1 was under 1000 lines... in Python.

In short:
The problem: transformers are slow and could be smarter.
The solution: you have fast and smart al

Midjourney debuts feature for generating consistent characters across multiple gen AI images

The popular AI image generating service Midjourney has deployed one of its most oft-requested features: the ability to recreate characters consistently across new images. This has been a major hurdle ...

Midjourney 8 Launches : Adds In-Image Text, HD Mode & Faster Image Generation

Midjourney 8 adds in-image text, HD mode, and personalization profiles, but long captions still render with occasional errors ...

Show HN: KatmerCode – Claude Code in Obsidian with academic research skills

I built an Obsidian plugin for my wife (academic researcher). It embeds Claude Code as a sidebar chat using the Agent SDK, and ships with 7 slash-command skills:

- /peer-review — 8-criteria manuscript evaluation with radar chart
- /cite-verify — checks references against CrossRef, Semantic Scholar, OpenAlex

Show HN: Neurotrace – The extension I built but never used

I’ve been working in ML software for 4 years, and I quickly ran into a recurring problem: I kept asking myself, "Why did I write this function this way?" or "Why is this block here?"

I tried to organize my thoughts with Obsidian and other note-taking apps, but let’s be honest, documenting for yourself feels like a chore. Documentation always feels like it's meant for "someone else."

So, I decided to build a VS Code extension to save my reasoning and contextual me

Skills are quietly becoming the unit of agent knowledge

In the last few months, agent skills went from a niche Claude Code feature to something every major runtime supports. Anthropic has an official skills repo. OpenAI shipped skills in Codex with a built-in skill-creator. Karpathy talks about "everything is skill issue" and describes writing skills as curricula for agents [1]. The format is converging: a folder with a SKILL.md, optional scripts, optional reference files.

What changed is that the models got good enough to follow written inst
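The converging layout that post describes can be sketched as a small scaffolding script. The frontmatter fields and directory names below are illustrative assumptions, not a spec:

```python
import tempfile
from pathlib import Path

def scaffold_skill(root: Path, name: str, instructions: str) -> Path:
    """Create the minimal converging layout: a folder holding a SKILL.md,
    with optional scripts/ and references/ directories alongside it."""
    skill = root / name
    (skill / "scripts").mkdir(parents=True, exist_ok=True)   # optional scripts
    (skill / "references").mkdir(exist_ok=True)              # optional reference files
    # Hypothetical frontmatter: a name plus a one-line description the
    # runtime can surface when deciding whether to load the skill.
    (skill / "SKILL.md").write_text(
        f"---\nname: {name}\ndescription: {instructions.splitlines()[0]}\n---\n\n"
        + instructions
    )
    return skill

skill = scaffold_skill(Path(tempfile.mkdtemp()), "cite-verify",
                       "Check references against external citation databases.")
print((skill / "SKILL.md").exists())  # True
```

The design bet is that the skill body is plain prose instructions: the model reads SKILL.md like documentation, and the optional scripts are just tools it may invoke while following them.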

Pentagon finds another 'problem' with Anthropic and this one is linked to China

The Pentagon has raised national security concerns regarding Anthropic's hiring of foreign workers, particularly from China.

Pentagon's Anthropic bashing rekindles Silicon Valley's resistance to war

Technology companies are backing Anthropic after it sued the Trump administration over its designation as a supply chain risk.

Pentagon Says Anthropic’s AI Safety Limits Make It An ‘Unacceptable’ Wartime Risk

In a 40-page court filing, the U.S. government argued Anthropic’s refusal to permit “all lawful uses” of Claude made the company too risky for national security systems.

Anthropic Is Worth $380 Billion: This Little-Known ETF Could Let You Own a Piece Before It IPOs

Anthropic competes heavily with OpenAI in the artificial intelligence (AI) landscape. Several leading AI developers including Nvidia, Microsoft, Alphabet, and Amazon are investors in Anthropic.

Anthropic just shipped an OpenClaw killer called Claude Code Channels, letting you message it over Telegram and Discord

The consensus among early adopters is that Anthropic has successfully internalized the most desirable features of the open-source movement—multi-channel support and long-term memory ...

OpenAI unveils small models GPT‑5.4 mini and nano

OpenAI (OPENAI) has launched GPT‑5.4 mini and nano, its most capable small models yet, according to the ChatGPT maker. The company said GPT‑5.4 mini significantly improves over GPT‑5 mini across ...

OpenAI's new GPT-5.4 mini model offers performance improvements in reasoning, multimodal understanding and more.

OpenAI announces GPT-5.4 mini and nano: All the details

OpenAI has launched GPT-5.4 mini and nano, focusing on faster performance, lower cost, and improved coding and reasoning capabilities for developers and high-volume AI workloads.

Xiaomi stuns with new MiMo-V2-Pro LLM nearing GPT-5.2, Opus 4.6 performance at a fraction of the cost

MiMo-V2-Pro utilizes a 7:1 hybrid ratio (increased from 5:1 in the Flash version) to manage its massive 1M-token context window.

OpenAI Launches GPT-5.4 Mini & Nano to Power Faster, Lightweight AI

OpenAI unveils GPT-5.4 Mini and Nano, lightweight AI models built for faster performance, real-time apps, and lower costs, now available via ChatGPT and API.