Latest Articles and Events

  • Fine-Tuning LLMs on a Local Multi-GPU AI Workstation

    Published: Wednesday, April 8, 2026 - General Information

    What fun is having an AI workstation if you don’t dive deeply into what makes AI work? Inference accounts for maybe 98% of AI tasks, from image generation to machine learning and even basic chats and summaries. We rely on model makers to do the heavy lifting and provide models that we can use, either in the cloud or locally.

  • Software Choices for a Multi-GPU AI Workstation

    Published: Tuesday, April 7, 2026 - General Information

    In my previous articles I discussed the balance between model quality and hardware requirements for running AI models locally. In this article I want to cover software: how my AI workstation build was designed around the software packages I use, and how I switch between them to complete certain tasks.

  • Building a Multi-GPU AI Workstation on a Budget

    Published: Monday, March 30, 2026 - General Information

    Building an AI workstation is rarely about buying the single fastest component on the market; it is about finding the specific "sweet spot" that fits your workload. Whether you are prioritizing VRAM capacity, PCIe lane availability, or thermal management, the right build depends entirely on how you plan to use your machine. The goal isn't just raw power, but the ability to run the tools you need efficiently.

  • Balancing Model Quality and Hardware Demands in AI Workstations

    Published: Friday, March 27, 2026 - General Information

    This is the first article in a new AI Hardware series covering the requirements for running AI models and why paying attention to them can help you build a better AI workstation. If nothing else, you’ll learn why a DGX Spark or Mac Mini works, but can also be really slow.

  • Lexar Celebrates their 30-year Anniversary at CES 2026

    Published: Saturday, January 17, 2026 - Events

    2026 marks Lexar’s 30th year, and they started CES with an announcement that they would be the global storage partner for the Argentina National Football (Soccer) Team, along with the launch of a number of AI-focused storage devices.

  • Hardware Asylum at CES 2026

    Published: Wednesday, January 14, 2026 - Events

    According to CES 2026 materials, there were over 4,100 registered exhibitors, along with several unknowns looking to take advantage of CES by hosting their own events in area hotels. It is not practical to see everything, but with a good pair of shoes and a crisp Pepsi I attempted the impossible.