Show HN: MCP server for Blender that builds 3D scenes via natural language

blender-mcp-psi.vercel.app

151 points by prono 2 days ago

Hi HN!

I built a custom MCP (Model Context Protocol) server that connects Blender to LLMs like ChatGPT, Claude, or any other LLM that supports tool calling and MCP, enabling the AI to understand and control 3D scenes using natural language.

You can describe an entire environment like:

> “Create a small village with 5 huts arranged around a central bonfire, add a river flowing on the left, place a wooden bridge across it, and scatter trees randomly.”

And the system parses that, reasons about the scene, and builds it inside Blender — no manual modeling or scripting needed.

What it can do:

- Generate multi-object scenes like villages and landscapes from a single prompt
- Understand spatial relations, e.g. “place the bridge over the river” or “add trees behind the huts”
- Create camera animations and lighting setups: “orbit around the scene at sunset lighting”
- Respond to iterative changes like “replace all huts with stone houses” or “make the river narrower”
- Maintain object hierarchy and labels for later editing
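
For illustration, here is a rough, hand-simplified sketch of the kind of Blender Python the LLM ends up driving for a prompt like the village example above (object names, counts, and coordinates are invented):

  import math
  import bpy

  # Clear the default scene (cube, light, camera) -- illustrative only.
  bpy.ops.object.select_all(action='SELECT')
  bpy.ops.object.delete()

  # Central bonfire: a point light stands in for the fire.
  bpy.ops.object.light_add(type='POINT', location=(0, 0, 0.5))
  bpy.context.object.name = "Bonfire"

  # Five huts arranged in a circle around the bonfire.
  for i in range(5):
      angle = i * 2 * math.pi / 5
      x, y = 6 * math.cos(angle), 6 * math.sin(angle)
      bpy.ops.mesh.primitive_cone_add(radius1=1.5, depth=2, location=(x, y, 1))
      bpy.context.object.name = f"Hut_{i}"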

Tech Stack:

- Blender Python scripting
- Node.js server running MCP
- LLM backend (OpenAI / Claude, easily swappable)

Demo: https://blender-mcp-psi.vercel.app/

GitHub: https://github.com/pranav-deshmukh/blender-mcp-demo/

Curious to hear thoughts from folks in 3D tooling, AI-assisted design, or dev interface design. Would you find this useful as a Blender plugin? I’m open to expanding it!

Please try it and give it a star on GitHub.

vunderba a day ago

Couple things:

1. Your GitHub repo doesn't have anything in it; it's just a generic MCP server.

2. How does this differ from blender-mcp released by ahujasid several months ago? That one actually does have the complete source and has quite a following.

https://github.com/ahujasid/blender-mcp

https://news.ycombinator.com/item?id=43357112

  • prono a day ago

    It is indeed an MCP server, but I have added some things that make it different from being generic. It works smoothly, as you can see from the code.

    And I am working on it; it is new and I am adding other things to it, like generating three.js scenes, adding free Blender asset APIs, etc. Happy if anyone else wants to contribute.

    • koolala a day ago

      Why star it on github if it doesn't include the code to run it?

      • keysdev a day ago

        Cause GitHub has become an advertising platform for devs. How did this get to the front page without code?

      • prono 19 hours ago

        What are you saying? It has code, brother, please visit once. And yes, if you want to contribute, I am more than happy.

gorbachev 2 days ago

The fade in effect when scrolling down is quite distracting, and makes reading the web page slower, because I have to wait for the text to appear. Yes, I have a fast computer.

  • halflife a day ago

    It is also very choppy on my iPhone 16, not sure why.

    Edit - I tried watching the demo, and it seems that on my phone the site is not usable: I can’t play the video, clicking on play does nothing, and the page keeps scrolling and jumping.

    • prono a day ago

      Fixing it asap

      • hgomersall a day ago

        The live demo video is broken in mobile Firefox. It displays, but is annoyingly cropped (and differently depending on landscape or portrait).

    • desdenova a day ago

      The site layout is completely broken.

      Probably vibecoded slop.

      • prono a day ago

        Hi, not at all vibecoded. I will fix it asap; I built it in a hurry, sorry for the issue.

      • brookst a day ago

        That's fairly rude.

        • llbbdd a day ago

          HN has been happily very rude about anything AI related the last year, even in cases here where it's hardly relevant or appropriate. It's depressing and I used to expect a lot better.

          • hombre_fatal a day ago

            It takes more work to build a janky site than a plain no-frills HTML/CSS one, unless you vibecoded it or copy-pasted a crappy template.

            I think we should be allowed to push back against sloppy work (which is different from beginner work) instead of indulging it with a smile.

            Instead, we have the rest of you babying them over adding the worst CSS transitions I’ve ever seen, something they deliberately swerved into.

            They are accused of vibe coding it only through charity because it’s hard to imagine they did it themselves and went “yup that’s exactly what I wanted after spending that extra time adding it.” Whether it’s vibe coded or not isn’t really the point.

            • llbbdd a day ago

              Grading on perceived effort is not a rubric destined to last. You cannot detect sloppy work from beginner work without context, and in any case a lot of beginner work these days (and to some degree for the rest of time!) is going to include LLMs or AI.

              Is HN only for advertising startups these days? If this post had nothing to do with AI maybe the response would have included some real genuine criticism and feedback, with the assumption baked-in that a beginner was being coached.

              To your last point, then downvote it if it's bad. You're right and I agree precisely that it being vibe-coded wasn't the point - but it was brought up regardless. If the result is bad the feedback is still the same. If the "problem" is just that they used tools you don't agree with using, then that's not feedback on the result.

  • johnisgood a day ago

    I do not think it has much to do with how fast your computer is, it is probably timed, e.g. from the CSS: "transition-duration: 0.3s". It is quite annoying.

    Almost akin to:

    - "How many CSS effects do you want?"

    - "Yes".

    :P

    At any rate, the project is pretty cool. Everything is just one prompt away now (not really, but still!).

    • prono a day ago

      Thanks for the feedback brother, I will surely improve the website

      • johnisgood a day ago

        The only issue I have is not being able to read the text right away, but perhaps making the animation faster might work?

  • tacker2000 a day ago

    Why these web page animations are still a thing in 2025, I will never understand…

deng 2 days ago

Hi, quick feedback: the demo is extremely short, so I can't really say much. Please generate more complicated scenes and, most importantly, inspect the wireframe. From what I could glean from the demo, the generated models are tri-based instead of quads, which would be a showstopper for me.

  • thegeomaster a day ago

    Just curious: why do you prefer/have a requirement of quad-based meshes?

    • deng a day ago

      Because traditionally, Blender modeling works best on a clean quad-based mesh. Just look at any modeling tutorial for Blender and one of the first things you learn is to always keep a clean, quad-based topology, and avoid triangles and n-gons as much as possible, as it will make further work on the model more painful, if not impossible. That starts with simple stuff like doing a loop cut to things like uv-unwrapping and using the sculpting tools. It's also better for subdivision surface modeling. You can of course use tri-based models, but if you want to refine them manually, it's often a pain. Usually, for me it's pretty much a "take as-is or leave it" situation for tri-based meshes, and since I see these AI-created models more as a starting point rather than the finished product, having a clean quad-based topology would be very important for me.

      • nextaccountic a day ago

        Is this true even if you do only or mostly sculpting?

        • LtdJorge a day ago

          No. But for animation meshes, it's the norm to use only quads. Mainly because of topology/retopology issues.

    • ddtaylor a day ago

      Sometimes texture artists like this a lot more.

      • deng 12 hours ago

        Yes, because uv-unwrapping is much more predictable with quads, and you can place seams along edge loops. I'm by no means an expert here, maybe there are tools which make this similarly easy with non-quad topology, but at least from what I've learnt, the clean grids you get from quad-meshes are simply much easier to deal with when doing texturing.

  • prono a day ago

    On it, thank you for the feedback.

qarthandyc a day ago

The fade-in effect is really distracting, and so poorly done. Elements have to reach almost 50% of the screen height before becoming readable.

It's sad to see animation hurting a good product.

bsenftner 2 days ago

An MCP server is not necessary; one can just call LLM service APIs directly from within Blender, and they already know Blender. The LLMs know it very well, it being open source with a gargantuan amount of data about it online in the form of tutorials and so on, all in foundation model training data.
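
For example, a minimal sketch of that approach: call an OpenAI-style chat completions endpoint from Blender's scripting tab and exec the returned script (no error handling; the key, model, and prompt are placeholders):

  import json
  import urllib.request

  API_KEY = "sk-..."  # placeholder
  req = urllib.request.Request(
      "https://api.openai.com/v1/chat/completions",
      data=json.dumps({
          "model": "gpt-4o",
          "messages": [
              {"role": "system", "content": "Reply with only a Blender Python script."},
              {"role": "user", "content": "Add five huts around a central bonfire."},
          ],
      }).encode(),
      headers={"Authorization": f"Bearer {API_KEY}",
               "Content-Type": "application/json"},
  )
  reply = json.loads(urllib.request.urlopen(req).read())
  script = reply["choices"][0]["message"]["content"]
  exec(script)  # runs the generated bpy code inside Blender; trust the output at your own risk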

zakki a day ago

Great work. In your "How to Setup", the cloned project is "blender-mcp" but the directory is "bleder-mcp-demo".

I don't have Claude and no experience with MCP. How do I use it with other tools such as LMStudio, ollama, etc.?

  • prono a day ago

    Fixing it; it's actually blender-mcp only, I changed the repo name from blender-mcp-demo to blender-mcp.

    And you can use the free tier of Claude Desktop or other open-source LLMs.

abrookewood a day ago

Congrats on releasing something. I'm not a Blender user, but I think the demo is pretty cool. Kind of crazy what MCP is allowing LLMs to do.

homarp 2 days ago

how does it compare to the existing https://blender-mcp.com/ ?

  • chmod775 2 days ago

    Slightly strange how both use the same example of a house with some trees.

    • prono a day ago

      Will use a better example, thank you for the suggestion.

  • N_Lens 2 days ago

    Better in every way since this is posted to HN!

    • boguscoder a day ago

      That one was discussed here too, many times

rsp1984 a day ago

I apologize for this extremely dumb question, but how is this a "server"? As far as I'm aware Blender is a local app. It can run without an internet connection. If an LLM wants to call into it, it needs to call its local python API.

Is this just unlucky naming or am I missing a critical piece?

  • tracerbulletx a day ago

    MCP is a spec that attempts to standardize a communication pattern for registering and calling tools from an LLM. Part of the spec is a server that exposes specific JSON-RPC endpoints with a registry of the available tools, resources, and templates, and a way of executing them. That's the server; in this case the server acts as the interface into Blender.
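
    For a rough picture, registering a tool with the official Python MCP SDK looks something like the sketch below (the tool name and body are invented for illustration; the project in this post uses a Node.js server instead):

      from mcp.server.fastmcp import FastMCP

      mcp = FastMCP("blender-bridge")  # hypothetical server name

      @mcp.tool()
      def add_object(kind: str, x: float, y: float, z: float) -> str:
          """Add a primitive (e.g. 'cube') to the Blender scene at (x, y, z)."""
          # A real bridge would forward this request to Blender,
          # e.g. over a local socket to an add-on listening inside Blender.
          return f"added {kind} at ({x}, {y}, {z})"

      if __name__ == "__main__":
          mcp.run()  # serves the MCP JSON-RPC protocol over stdio to the client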

  • ww520 a day ago

    The pipeline for LLM to MCP and to the app looks like,

      LLM -> chat app -> MCP client -> MCP server -> specific app (Blender)
    
    The chat app doesn’t know how to talk to Blender. It knows about MCP and links in a client. Blender exposes its functionality via an MCP server. MCP connects the two.

  • camdenreslink a day ago

    A server-client architecture can run on a single computer. You just need one piece of code to act as the "server" and one to act as the "client". Technically you don't even necessarily have to involve the networking stack, you can just communicate between processes.
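
    As a concrete sketch of that single-machine setup: the piece inside Blender can be as small as a loopback-socket listener that executes whatever bpy snippet the MCP server process sends it (illustrative only, with no error handling or sandboxing; a real add-on would marshal the work onto Blender's main thread):

      import socket
      import threading

      import bpy  # available because this runs inside Blender

      def serve():
          srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
          srv.bind(("127.0.0.1", 9876))  # arbitrary local port
          srv.listen(1)
          while True:
              conn, _ = srv.accept()
              code = conn.recv(65536).decode()
              # NOTE: a real add-on should hand this to the main thread, e.g. via bpy.app.timers
              exec(code, {"bpy": bpy})
              conn.sendall(b"ok")
              conn.close()

      threading.Thread(target=serve, daemon=True).start()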

fode a day ago

This is an awesome use of MCP. Thank you!

  • prono a day ago

    Thanks a lot brother

redrove 2 days ago

Does anyone know of a way to create custom 3D print designs with LLMs? Is there a bespoke project or service somewhere?

  • r2_pilot a day ago

    I have successfully (if inefficiently, but faster than if I did it on my own) used Claude with OpenSCAD to make 3D printed products.

dcreater a day ago

Your vibe coded website has a lot of issues on mobile

franze a day ago

Screenshots would be nice.

ur-whale a day ago

Is there a feedback loop?

As in:

External Prompt -> Claude -> MCP -> Blender -> Cycles -> .exr -> show Claude how good its work actually is -> Correct -> New prompt -> ... Rinse and repeat until result actually looks realistic.
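
The Blender/Cycles leg of that loop is easy to script; a minimal sketch of the render-to-.exr step (engine, format, and path are just placeholders):

  import bpy

  scene = bpy.context.scene
  scene.render.engine = 'CYCLES'
  scene.render.image_settings.file_format = 'OPEN_EXR'
  scene.render.filepath = "/tmp/pass_01.exr"  # handed back to the LLM for critique
  bpy.ops.render.render(write_still=True)

The harder part is the "show Claude how good its work actually is" step, i.e. feeding the render back in as a vision input and getting a useful correction out of it.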

mattigames a day ago

I'm tired of the half-way-there automations, I want an MCP that can replace the person that would need to use this.

  • bravesoul2 a day ago

    Taken to the nth degree you want an MCP server that makes you a feature length animation or invents a new device and ships it to you?

    • mattigames a day ago

      Of course not, I would just ask for an MCP that watches the generated movie so I can use my time for more important matters. I just want the system to work by itself entirely; we could have these full consumerism silos and just enjoy being called their gods, but perhaps we could automate such egocentrism too.

      • r2_pilot a day ago

        > Of course not, I would just ask for an MCP that watches the generated movie so I can use my time for more important matters

        Well, now I know why "they" bother to digitally simulate my existence, and why movies are so terrible.

  • prono a day ago

    Ha ha, will reach there eventually

  • willsmith72 a day ago

    sounds like my CEO. "we don't need engineers, we have ai"

    now who runs the AI?

    ...

    • mattigames a day ago

      Obviously you set up a battle-royale-style competition with all the engineers where the only one who survives gets to be in charge of all the AI.

contingencies a day ago

I managed to do something like this directly in WebGL via threejs in Windsurf 2 weeks ago; you can see the resulting animation over here: https://infinite-food.com/ I also did an SVG animation and a globe with geopoints. So much easier than by hand...