# Pinning GitLab Inputs

GitLab is an open-core Git forge developed primarily by GitLab Inc. It comes in both community & commercial editions (which can be self-hosted). There is also a U.S.-based gratis instance at https://gitlab.com/users/ (terms apply).

## How to pin a GitLab repository with Nix + Nixtamal

Let's demonstrate using the highlight repository; when writing your own manifest, substitute the owner & project slugs. In most cases, you will want to prefer fetching the archive (GitLab supports bzip2) over the Git input kind. We prefer `git ls-remote` for being more generic.

### Latest revision

```kdl
// manifest.kdl
inputs {
    highlight {
        archive {
            url "https://gitlab.com/saalen/highlight/-/archive/{{fresh-value}}/highlight-{{fresh-value}}.tar.bz2"
        }
    }
    fresh-cmd {
        $ git ls-remote --branches "https://gitlab.com/saalen/highlight.git" master | cut -f1
    }
}
```

### Latest stable tagged version

```kdl
// manifest.kdl
inputs {
    highlight {
        archive {
            url "https://gitlab.com/saalen/highlight/-/archive/{{fresh-value}}/highlight-{{fresh-value}}.tar.bz2"
        }
    }
    // Look for tags starting with v[0-9], since some tags didn't follow
    // this version structure, but also ending in [0-9] to filter out
    // alpha, beta, & release tags
    fresh-cmd {
        $ git ls-remote --tags --sort=v:refname "https://gitlab.com/saalen/highlight.git" | grep -E 'refs/tags/v([0-9]+\.)+[0-9]+$' | tail -n1 | sed 's|.*refs/tags/||'
    }
}
```

---

Site made with Nix (dep management), Nickel (config), Soupault (SSG), Docutils (rST rendering), mandoc (manpage conversion), & sugilite256 (color scheme).

© 2025–2026 toastal. © 2026 Nixtamal contributors. Some rights reserved. Except where otherwise noted, the content on this website is licensed under CC-BY-SA-4.0.
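As a postscript to the recipe above: you can sanity-check the tag-filtering pipeline offline by feeding it simulated `git ls-remote --tags --sort=v:refname` output. The object IDs & tag names below are hypothetical stand-ins; a real run queries the remote over the network.

```shell
# Simulated, version-sorted `git ls-remote --tags` output (tab-separated
# object ID & ref name, as ls-remote prints them).
printf '%s\t%s\n' \
    aaa111 refs/tags/release-2024 \
    bbb222 refs/tags/v4.0 \
    ccc333 refs/tags/v4.1-alpha \
    ddd444 refs/tags/v4.1 |
    grep -E 'refs/tags/v([0-9]+\.)+[0-9]+$' |  # keep only vX.Y[.Z…] tags
    tail -n1 |                                 # newest tag that survived the filter
    sed 's|.*refs/tags/||'                     # strip down to the bare tag name
# → v4.1
```

Note how `refs/tags/release-2024` fails the `v[0-9]` prefix and `refs/tags/v4.1-alpha` fails the `[0-9]$` suffix, so only the stable `v4.0` & `v4.1` remain before `tail -n1` picks the newest.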
Citations must attribute the work's writer/maker & include a hyperlink to this website (or rather the work itself). Yes, these rules/clauses apply to LLMs & AI assistants too.