<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Mini PC Archives - Prsm Studio</title>
	<atom:link href="https://prsm-studio.com/en/tag/mini-pc-en/feed/" rel="self" type="application/rss+xml" />
	<link>https://prsm-studio.com/en/tag/mini-pc-en/</link>
	<description>automation · homeserver · side projects · game · gadgets · play</description>
	<lastBuildDate>Mon, 09 Mar 2026 07:57:07 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	

<image>
	<url>https://prsm-studio.com/wp-content/uploads/2026/03/ic_launcher-playstore-150x150.png</url>
	<title>Mini PC Archives - Prsm Studio</title>
	<link>https://prsm-studio.com/en/tag/mini-pc-en/</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Even a Code-Illiterate Built It! Home Server Journey (4) — Running AI Locally with Ollama</title>
		<link>https://prsm-studio.com/en/code-illiterate-home-server-build-4-ollama-local-ai-en/</link>
					<comments>https://prsm-studio.com/en/code-illiterate-home-server-build-4-ollama-local-ai-en/#respond</comments>
		
		<dc:creator><![CDATA[Toaster]]></dc:creator>
		<pubDate>Mon, 09 Mar 2026 03:15:56 +0000</pubDate>
				<category><![CDATA[Computer Play]]></category>
		<category><![CDATA[Home Server]]></category>
		<category><![CDATA[LLM]]></category>
		<category><![CDATA[local AI]]></category>
		<category><![CDATA[Mini PC]]></category>
		<category><![CDATA[Ollama]]></category>
		<category><![CDATA[Open WebUI]]></category>
		<category><![CDATA[self-hosted AI]]></category>
		<category><![CDATA[SER9 MAX]]></category>
		<guid isPermaLink="false">https://prsm-studio.com/code-illiterate-home-server-build-4-ollama-local-ai-en/</guid>

					<description><![CDATA[<p>I installed Ollama and Open WebUI on my home server to run free local AI. Real benchmarks from my SER9 MAX mini PC, RAM-based model guide, and honest conclusions.</p>
<p>The post <a href="https://prsm-studio.com/en/code-illiterate-home-server-build-4-ollama-local-ai-en/">Even a Code-Illiterate Built It! Home Server Journey (4) — Running AI Locally with Ollama</a> appeared first on <a href="https://prsm-studio.com/en">Prsm Studio</a>.</p>
]]></description>
										<content:encoded><![CDATA[<h2>Running AI on My Own Server?</h2>
<p>ChatGPT, Gemini, Claude… everyone uses cloud AI. But have you ever thought:</p>
<p><strong>&#8220;If I run AI on my own computer, it&#8217;s free AND my data stays private?&#8221;</strong></p>
<p>That&#8217;s exactly right. Running a local LLM (Large Language Model) means no subscription fees and zero data leaving your machine. Perfect privacy.</p>
<p>But reality is… a bit different. I installed AI on <a href="/en/code-illiterate-home-server-build-1-ser9max-windows11-wsl2-docker-en/">my SER9 MAX mini PC from Episode 1</a>, and the honest verdict? <strong>&#8220;It works. But it&#8217;s slow.&#8221;</strong></p>
<figure class="wp-block-image size-large"><img fetchpriority="high" decoding="async" width="940" height="627" src="https://prsm-studio.com/wp-content/uploads/2026/03/stock-30530416-1.jpg" alt="A MacBook displaying the DeepSeek AI interface, showcasing digital innovation." class="wp-image-225" srcset="https://prsm-studio.com/wp-content/uploads/2026/03/stock-30530416-1.jpg 940w, https://prsm-studio.com/wp-content/uploads/2026/03/stock-30530416-1-300x200.jpg 300w, https://prsm-studio.com/wp-content/uploads/2026/03/stock-30530416-1-768x512.jpg 768w" sizes="(max-width: 940px) 100vw, 940px" /><figcaption>Photo by Matheus Bertelli / Pexels</figcaption></figure>
<h2>Ollama — The Local LLM Engine</h2>
<p>Ollama is a tool that lets you run AI models on your own hardware. Sounds complicated? I had AI install it for me. A few terminal commands and done.</p>
<p>Once installed, one command — <code>ollama run qwen3:14b</code> — and the AI starts responding. The model downloads automatically, no configuration needed.</p>
<p>There are dozens of open-source models available: Llama, Qwen, Gemma, Mistral, DeepSeek… all free. Pick whichever fits your needs.</p>
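<p>For the record, the whole flow is only a handful of terminal commands. A sketch of the standard setup (the install script is the one published on Ollama&#8217;s site; the model tag matches the one I used):</p>

```shell
# Install Ollama on Linux / WSL2 via the official install script
curl -fsSL https://ollama.com/install.sh | sh

# Chat with a model; the first run downloads the weights automatically
ollama run qwen3:14b

# See which models are installed, and which are currently loaded in memory
ollama list
ollama ps
```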
<h2>Open WebUI — ChatGPT Interface in Your Browser</h2>
<p>Chatting in a terminal is honestly uncomfortable. So I installed <strong>Open WebUI</strong> — a program that gives you the exact same ChatGPT-like interface, running entirely on your server.</p>
<p>Again, AI handled the installation. One Docker container and it&#8217;s running.</p>
<p>The best part? <strong>My wife uses it too.</strong> Anyone on the same network can open a browser on their phone or tablet and start chatting. You can create separate accounts, so conversation history stays private for each person. With <a href="/en/code-illiterate-home-server-build-2-tailscale-remote-access-en/">Tailscale from Episode 2</a>, it&#8217;s accessible from anywhere.</p>
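<p>If you want to try it yourself, the Open WebUI documentation boils the setup down to a single Docker command. Roughly (the port mapping and volume name below are the documented defaults, adjust to taste):</p>

```shell
# Run Open WebUI in Docker, pointing it at the Ollama instance on the host
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

<p>Once the container is up, the interface is reachable on port 3000 of the server&#8217;s address from any device on the network.</p>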
<figure class="wp-block-image size-large"><img decoding="async" width="433" height="650" src="https://prsm-studio.com/wp-content/uploads/2026/03/stock-30530413-1.jpg" alt="A laptop displaying a conversational AI interface with the DeepSeek application." class="wp-image-226" srcset="https://prsm-studio.com/wp-content/uploads/2026/03/stock-30530413-1.jpg 433w, https://prsm-studio.com/wp-content/uploads/2026/03/stock-30530413-1-200x300.jpg 200w" sizes="(max-width: 433px) 100vw, 433px" /><figcaption>Photo by Matheus Bertelli / Pexels</figcaption></figure>
<h2>Specs vs. Reality — This Is What Matters</h2>
<p>The most important question in local AI is <strong>&#8220;Can my hardware actually handle it?&#8221;</strong> Here are my real-world numbers.</p>
<h3>My Server Specs</h3>
<table>
<tr>
<th>Component</th>
<th>Specification</th>
</tr>
<tr>
<td>CPU</td>
<td>AMD Ryzen 7 255 (8 cores, 16 threads)</td>
</tr>
<tr>
<td>RAM</td>
<td>DDR5 32GB</td>
</tr>
<tr>
<td>GPU</td>
<td>Integrated (AMD Radeon 780M) — <strong>effectively none</strong></td>
</tr>
<tr>
<td>Storage</td>
<td>NVMe SSD 1TB</td>
</tr>
<tr>
<td>OS</td>
<td>Windows 11 + WSL2 (Linux)</td>
</tr>
</table>
<h3>Real Benchmarks (Qwen3 14B Model)</h3>
<table>
<tr>
<th>Metric</th>
<th>Value</th>
</tr>
<tr>
<td>Generation Speed</td>
<td><strong>5.5 tokens/sec</strong></td>
</tr>
<tr>
<td>Simple Question Response</td>
<td>~25 seconds</td>
</tr>
<tr>
<td>RAM Usage</td>
<td>~10GB</td>
</tr>
<tr>
<td>Quantization</td>
<td>Q4_K_M (9.3GB file)</td>
</tr>
</table>
<p>What ChatGPT answers in a second or two takes <strong>my server 25 seconds.</strong> That&#8217;s an order of magnitude slower in real usage. Watching characters appear one by one is… a patience test.</p>
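<p>The math behind that patience test is simple: response time is roughly output tokens divided by generation speed. A quick sanity check using the table&#8217;s 5.5 tokens/sec (the 140-token answer length is my assumption for a short reply):</p>

```shell
# Estimated response time = output tokens / generation speed
# 140 tokens is an assumed short-answer length; 5.5 tok/s is from the benchmark table
awk -v tps=5.5 -v n=140 'BEGIN { printf "%.1f seconds\n", n / tps }'
```

<p>That lands right around the ~25 seconds I actually measured.</p>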
<h3>Why So Slow?</h3>
<p><strong>No dedicated GPU.</strong> AI inference is optimized for GPU computing, but my mini PC only has integrated graphics. I&#8217;ve confirmed that the AMD 780M iGPU can&#8217;t be used for AI acceleration under WSL2. Everything runs on <strong>CPU only</strong> — hence the speed.</p>
<p>With an NVIDIA GPU? The same model runs <strong>5-10x faster.</strong> An RTX 4060 can push 30+ tokens/second. But you can&#8217;t put a discrete GPU in a mini PC — that&#8217;s desktop or gaming laptop territory.</p>
<h3>RAM Determines Model Size</h3>
<p>The most important spec for local AI is <strong>RAM</strong>. The entire model loads into memory.</p>
<table>
<tr>
<th>RAM</th>
<th>Model Size</th>
<th>Quality</th>
</tr>
<tr>
<td>8GB</td>
<td>7B (7 billion parameters)</td>
<td>Basic chat OK, struggles with complexity</td>
</tr>
<tr>
<td>16GB</td>
<td>14B (14 billion parameters)</td>
<td>Decent conversation, handles general tasks</td>
</tr>
<tr>
<td>32GB</td>
<td>14B + headroom / can try 30B</td>
<td>Comfortable 14B + other services running</td>
</tr>
<tr>
<td>64GB+</td>
<td>70B (70 billion parameters)</td>
<td>Approaching ChatGPT quality</td>
</tr>
</table>
<p><strong>7B vs 14B vs 70B — bigger means better.</strong> 7B handles simple chat but frequently hallucinates on complex questions. 14B is the minimum threshold where it feels &#8220;actually usable.&#8221; 70B jumps in quality but needs 40GB+ RAM.</p>
<p>That&#8217;s why I have 32GB. Running a 14B model while also keeping other Docker services (<a href="/en/code-illiterate-home-server-build-3-immich-photo-backup-en/">Immich</a>, WordPress, n8n, etc.) alive requires the headroom.</p>
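<p>A rough rule of thumb ties these numbers together: a quantized model needs about (parameters in billions) × (bits per weight) ÷ 8 gigabytes, plus runtime overhead for the KV cache and context. Both the ~4.85 bits-per-weight average for Q4_K_M and the 20% overhead factor below are approximations, not exact figures:</p>

```shell
# Back-of-envelope RAM estimate for a quantized model
# 14B params, Q4_K_M (~4.85 bits/weight on average), +20% runtime overhead (both approximate)
awk -v p=14 -v bpw=4.85 'BEGIN { printf "~%.1f GB\n", p * bpw / 8 * 1.2 }'
```

<p>That estimate is close to the ~10GB of RAM I saw Qwen3 14B actually use.</p>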
<figure class="wp-block-image size-large"><img decoding="async" width="867" height="650" src="https://prsm-studio.com/wp-content/uploads/2026/03/stock-31993524-1.jpg" alt="T-Force Delta RGB DDR5 memory modules on a bright yellow surface." class="wp-image-227" srcset="https://prsm-studio.com/wp-content/uploads/2026/03/stock-31993524-1.jpg 867w, https://prsm-studio.com/wp-content/uploads/2026/03/stock-31993524-1-300x225.jpg 300w, https://prsm-studio.com/wp-content/uploads/2026/03/stock-31993524-1-768x576.jpg 768w" sizes="(max-width: 867px) 100vw, 867px" /><figcaption>Photo by Andrey Matveev / Pexels</figcaption></figure>
<h2>So Is It Worth It?</h2>
<p>Here&#8217;s my honest summary:</p>
<p><strong>Worth it for:</strong></p>
<ul>
<li>Simple conversations, translation, summarization — slow but delivers results</li>
<li>Privacy-sensitive content — analyzing confidential work documents</li>
<li>Offline use — on a plane, in areas with no internet</li>
<li>Connecting AI to other apps — unlimited API calls, zero cost</li>
</ul>
<p><strong>Not worth it for:</strong></p>
<ul>
<li>Coding, complex analysis — cloud AI is overwhelmingly better</li>
<li>When you need fast responses — if you can&#8217;t wait 25 seconds</li>
<li>When you need current information — local models don&#8217;t know anything after their training date</li>
</ul>
<p>The core value of local AI is <strong>&#8220;free&#8221;</strong> and <strong>&#8220;privacy.&#8221;</strong> If you&#8217;re expecting performance, you&#8217;ll be disappointed. But if those two things matter to you, it&#8217;s absolutely worthwhile.</p>
<h2>Next Episode Preview</h2>
<p>So far we&#8217;ve covered building the server, remote access, photo backup, and local AI. Next up is the piece that ties everything together — <strong>an AI agent and Telegram bot.</strong> Send a message on Telegram, and AI handles the rest. Building your own digital assistant.</p>
<p><strong>EP.5 — AI Agent + Telegram: Putting a Secretary on Your Server.</strong> Stay tuned.</p>
<p>The post <a href="https://prsm-studio.com/en/code-illiterate-home-server-build-4-ollama-local-ai-en/">Even a Code-Illiterate Built It! Home Server Journey (4) — Running AI Locally with Ollama</a> appeared first on <a href="https://prsm-studio.com/en">Prsm Studio</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://prsm-studio.com/en/code-illiterate-home-server-build-4-ollama-local-ai-en/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>[Computer Play] Even a Code-Illiterate Built It! My Home Server Journey (1) &#8211; Starting with SER9 MAX, Windows 11, WSL2, and Docker 💻🚀 (feat. Claude &#038; Claude Code)</title>
		<link>https://prsm-studio.com/en/code-illiterate-home-server-build-1-ser9max-windows11-wsl2-docker-en/</link>
					<comments>https://prsm-studio.com/en/code-illiterate-home-server-build-1-ser9max-windows11-wsl2-docker-en/#respond</comments>
		
		<dc:creator><![CDATA[Toaster]]></dc:creator>
		<pubDate>Wed, 04 Mar 2026 05:50:49 +0000</pubDate>
				<category><![CDATA[Computer Play]]></category>
		<category><![CDATA[Home Server]]></category>
		<category><![CDATA[Claude]]></category>
		<category><![CDATA[Claude Code]]></category>
		<category><![CDATA[Code-Illiterate]]></category>
		<category><![CDATA[Docker]]></category>
		<category><![CDATA[Mini PC]]></category>
		<category><![CDATA[SER9 MAX]]></category>
		<category><![CDATA[Windows 11]]></category>
		<category><![CDATA[WSL2]]></category>
		<guid isPermaLink="false">http://wordpress:80/code-illiterate-home-server-build-1-ser9max-windows11-wsl2-docker-en/</guid>

					<description><![CDATA[<p>Even a code-illiterate Toaster did it! The first installment of my home server building series using mini PC SER9 MAX, Windows 11, WSL2, and Docker. Introducing an exciting journey started with the help of Claude and Claude Code.</p>
<p>The post <a href="https://prsm-studio.com/en/code-illiterate-home-server-build-1-ser9max-windows11-wsl2-docker-en/">[Computer Play] Even a Code-Illiterate Built It! My Home Server Journey (1) &#8211; Starting with SER9 MAX, Windows 11, WSL2, and Docker 💻🚀 (feat. Claude &#038; Claude Code)</a> appeared first on <a href="https://prsm-studio.com/en">Prsm Studio</a>.</p>
]]></description>
					<content:encoded><![CDATA[
<h3><span class="ez-toc-section" id="Computer_Play_Even_a_Code-Illiterate_Built_It_My_Home_Server_Journey_1_%E2%80%93_Starting_with_SER9_MAX_Windows_11_WSL2_and_Docker_%F0%9F%92%BB%F0%9F%9A%80_feat_Claude_Claude_Code"></span><strong>[Computer Play] Even a Code-Illiterate Built It! My Home Server Journey (1) &#8211; Starting with SER9 MAX, Windows 11, WSL2, and Docker <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f4bb.png" alt="💻" class="wp-smiley" style="height: 1em; max-height: 1em;" /><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f680.png" alt="🚀" class="wp-smiley" style="height: 1em; max-height: 1em;" /> (feat. Claude &#038; Claude Code)</strong><span class="ez-toc-section-end"></span></h3>
<p>Hello, I&#8217;m <strong>Toaster</strong>! <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f64b-200d-2642-fe0f.png" alt="🙋‍♂️" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Today, I&#8217;d like to share the first story of an exciting project I embarked on: <strong>building my own home server</strong>. To be honest, I&#8217;m completely <strong>illiterate</strong> when it comes to code or computers. Yet, driven by the growing costs of cloud services and concerns about my data sovereignty, I decided to create &#8216;my own playground.&#8217; The journey began with a mini PC, the <strong>Beelink SER9 MAX</strong>. A special highlight is that this entire journey started with <strong>Claude</strong>, and the installation process was seamlessly handled by <strong>Claude Code</strong>!</p>
<h4><span class="ez-toc-section" id="1_Why_Did_I_Want_to_Build_a_Home_Server_And_Why_SER9_MAX_%E2%9C%A8"></span><strong>1. Why Did I Want to Build a Home Server? And Why SER9 MAX? <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2728.png" alt="✨" class="wp-smiley" style="height: 1em; max-height: 1em;" /></strong><span class="ez-toc-section-end"></span></h4>
<p>Initially, I used cloud servers. However, as time went on, the monthly costs became a burden, and I felt a vague unease about my precious data being stored somewhere else. So, I decided to &#8216;manage a server directly with my own hands.&#8217; I dreamed of a digital playground operated in my own space, under my own rules. <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f3f0.png" alt="🏰" class="wp-smiley" style="height: 1em; max-height: 1em;" /></p>
<p>I spent a lot of time considering which hardware to choose for building a home server. After comparing several mini PCs, the <strong>Beelink SER9 MAX</strong> caught my eye. 10 Gigabit Ethernet, dual M.2 NVMe slots, DDR5 memory, and an efficient AMD Ryzen 7 H255 processor! It boasted incredible specs for its small size. I vividly remember the excitement of ordering it from Amazon and waiting for its arrival. <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f4e6.png" alt="📦" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Throughout this entire process of exploration and decision-making, <strong>Claude</strong> provided invaluable assistance with various information searches and comparative analyses.</p>
<h4><span class="ez-toc-section" id="2_Is_Windows_11_Suitable_as_a_Home_Server_OS_%F0%9F%A4%94"></span><strong>2. Is Windows 11 Suitable as a Home Server OS? <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f914.png" alt="🤔" class="wp-smiley" style="height: 1em; max-height: 1em;" /></strong><span class="ez-toc-section-end"></span></h4>
<p>When I received the SER9 MAX, I found that <strong>Windows 11</strong> was pre-installed. Typically, when people think of a home server, Linux often comes to mind, but I&#8217;m familiar with the Windows environment, and installing a new Linux server OS right away seemed cumbersome. So, I decided to use Windows 11 as is.</p>
<p><strong>The advantages were clear.</strong> The familiar UI/UX made initial setup incredibly convenient, and its compatibility with various Windows software was excellent. For purposes like a media server or simple file sharing, it was quite appealing. However, there were also <strong>clear drawbacks.</strong> Compared to Linux-based server operating systems, Windows generally consumes more system resources like CPU and RAM, meaning that 24/7 stable operation requires more attention. The absence of advanced features like Remote Desktop Server and Hyper-V in Windows 11 Home was also a downside.</p>
<h4><span class="ez-toc-section" id="3_A_Small_Linux_World_Within_Windows_My_WSL2_Installation_Journey_%F0%9F%90%A7"></span><strong>3. A Small Linux World Within Windows: My WSL2 Installation Journey <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f427.png" alt="🐧" class="wp-smiley" style="height: 1em; max-height: 1em;" /></strong><span class="ez-toc-section-end"></span></h4>
<p>I learned that `WSL2 (Windows Subsystem for Linux 2)` was essential for installing `Docker` on my home server. This is because `Docker Desktop` uses the `WSL2` backend to run Linux-based containers on Windows. At first, I was worried it might be complicated, but I entrusted the installation to <strong>Claude Code</strong>, and it handled everything seamlessly.</p>
<p>Opening PowerShell with administrator privileges and entering the `wsl --install` command automatically installed `WSL` along with a default `Linux` distribution (for me, `Ubuntu`). Even setting `WSL2` as the default version after rebooting was handled by <strong>Claude Code</strong> without any fuss, leading to a successful and quick setup! It felt amazing to have my own mini Linux server within Windows. <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f929.png" alt="🤩" class="wp-smiley" style="height: 1em; max-height: 1em;" /></p>
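<p>The steps above boil down to a couple of commands, run in an administrator PowerShell window. A sketch of the standard flow, not a transcript of my exact session:</p>

```shell
# Install WSL with the default Ubuntu distribution (reboot when prompted)
wsl --install

# After rebooting, make WSL2 the default version and confirm the state
wsl --set-default-version 2
wsl --list --verbose
```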
<h4><span class="ez-toc-section" id="4_The_Magic_of_Containers_Docker_Desktop_Installation_and_Integration_%F0%9F%90%B3"></span><strong>4. The Magic of Containers: Docker Desktop Installation and Integration <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f433.png" alt="🐳" class="wp-smiley" style="height: 1em; max-height: 1em;" /></strong><span class="ez-toc-section-end"></span></h4>
<p>With `WSL2` installed, it was time to install `Docker Desktop`, the core of my home server. `Docker Desktop` is a truly powerful tool that enables easy building and running of Linux-based containers on `Windows` via the `WSL2` backend.</p>
<p>I downloaded the `Docker Desktop for Windows` installer from the official `Docker` website and began the installation. During the process, I carefully ensured that the <strong>&#8220;Use WSL 2 instead of Hyper-V&#8221;</strong> option was selected. After installation, I went to the `Resources > WSL Integration` tab in `Docker Desktop` settings and enabled integration with the `Ubuntu` distribution. <strong>Claude Code</strong> took care of all these steps automatically, so I simply had to observe.</p>
<p>Finally, when I opened the `Ubuntu` terminal and entered the `docker --version` and `docker run hello-world` commands, I felt a sense of accomplishment seeing the &#8220;Hello from Docker!&#8221; message. <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f389.png" alt="🎉" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Now, even complex server environments can be managed simply at the container level!</p>
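<p>For completeness, the verification looks like this inside the Ubuntu terminal (standard Docker commands, nothing specific to my setup):</p>

```shell
# Confirm the Docker CLI inside WSL2 can reach the Docker Desktop engine
docker --version

# Pull and run the official test image; it prints a "Hello from Docker!" greeting on success
docker run hello-world
```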
<h4><span class="ez-toc-section" id="5_Conclusion_Taking_the_First_Step_in_Building_My_Home_Server_%F0%9F%92%96"></span><strong>5. Conclusion: Taking the First Step in Building My Home Server <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f496.png" alt="💖" class="wp-smiley" style="height: 1em; max-height: 1em;" /></strong><span class="ez-toc-section-end"></span></h4>
<p>Thus, starting with the <strong>SER9 MAX</strong>, I successfully took the first step in building my own home server by installing `Windows 11`, `WSL2`, and `Docker`. Throughout this entire process, <strong>Claude</strong> and <strong>Claude Code</strong> were like capable assistants, with <strong>Claude</strong> providing accurate information and <strong>Claude Code</strong> executing the commands, which was incredibly reassuring. I realized that even someone like me, who knows little about code or computers, can achieve this. <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f91d.png" alt="🤝" class="wp-smiley" style="height: 1em; max-height: 1em;" /></p>
<p>In the next installment, I plan to discuss how to deploy various home server services using `Docker Compose` on the environment built today, and how to configure network settings for secure external access. Please look forward to it! <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f609.png" alt="😉" class="wp-smiley" style="height: 1em; max-height: 1em;" /></p>
<p>The post <a href="https://prsm-studio.com/en/code-illiterate-home-server-build-1-ser9max-windows11-wsl2-docker-en/">[Computer Play] Even a Code-Illiterate Built It! My Home Server Journey (1) &#8211; Starting with SER9 MAX, Windows 11, WSL2, and Docker 💻🚀 (feat. Claude &#038; Claude Code)</a> appeared first on <a href="https://prsm-studio.com/en">Prsm Studio</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://prsm-studio.com/en/code-illiterate-home-server-build-1-ser9max-windows11-wsl2-docker-en/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
