
CMake Build Study Notes 14 - Dependency Management Tools


If there is one biggest pain point in C/C++ development, it has to be the lack of an official, unified package manager. Seriously, to get anything done with C/C++ (on Windows, at least), you have to go through roughly the following steps:

  1. The C/C++ language itself, the standard library, and the OS APIs get you only so far, unless you really want to reinvent every wheel from scratch.
  2. So you start looking for off-the-shelf libraries to depend on. Ideally you find pre-compiled packages or installers, but even then you may not be able to use them because of binary compatibility issues.
  3. If there is no pre-compiled package or installer, you have to build from source yourself. It is fine if the project provides CMake; if not, you have to figure out how to organize and build the project on your own.
  4. Note that the dependency libraries have dependencies of their own! For example, when you build GDAL you discover that PROJ is a mandatory dependency of GDAL; when you start building PROJ you discover that sqlite3 is a mandatory dependency of PROJ; and when you are ready to build sqlite3 you discover that sqlite3 does not provide a CMake build at all.
  5. That is before all the issues you run into during the build itself. Once the dependency libraries are built, you still have to think about how to bring them into your project. With dynamic link libraries you need the header files, the import libraries, and the runtime DLL configuration: if the headers are wrong you cannot compile, if the import library is wrong you cannot link, and if the DLL is wrong you cannot run.
  6. Only then can you finally include the headers in your source code and call the dependency's functions.

Honestly, I broke into a sweat just writing out these steps. If this were Python or JavaScript development, an install command and an import statement would be the end of it; no wonder programmers complain about the efficiency of C/C++ development. That said, the C/C++ world is not standing still: on Windows you can use package managers such as vcpkg, Conan, and Chocolatey. In my opinion these tools are still maturing and need some more time to improve, but interested readers should give them a try.
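
For a sense of what that looks like, a typical vcpkg session is roughly the following (the package name and the vcpkg install path are illustrative, not taken from this post):

# Install a library for the 64-bit Windows triplet (illustrative path)
C:/dev/vcpkg/vcpkg.exe install libzip:x64-windows

# Let CMake resolve the installed package through the vcpkg toolchain file
cmake .. -DCMAKE_TOOLCHAIN_FILE="C:/dev/vcpkg/scripts/buildsystems/vcpkg.cmake"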

Another way is to try to organize a dependency repository management tool for yourself or your team, as the author did. There are three reasons for doing so:

  1. There are binary compatibility issues with C/C++ packages in different environments.
  2. You often need Release build outputs that carry debugging information (i.e. RelWithDebInfo builds), together with the symbol (PDB) files.
  3. Some library packages are rare, and generic package managers don't always include them.

So how do we do this? There is no need to overcomplicate it: install the build output of every dependency into the same directory, and point an environment variable at that directory. In day-to-day development we then configure dependent libraries through this environment variable. That way the project configuration is identical for every team member, regardless of differences in their hardware and software environments, and our code projects stay consistent.
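
For example, if the shared install directory is exposed through an environment variable named GISBasic (the same variable used later in this post), it could be set once and then consumed by every project like this (a minimal sketch; the directory path is only an example):

# Set the shared install directory once for the current user (example path)
[Environment]::SetEnvironmentVariable("GISBasic", "D:/GISBasic", "User")

# Every project then finds its dependencies through the same variable
cmake .. -G "Visual Studio 16 2019" -A x64 -DCMAKE_PREFIX_PATH="$env:GISBasic"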

So the key is how to organize the builds of these common components. The CMake GUI tool is not well suited to this, because each library has its own unique build options, and those are best recorded in scripts. It pays to script the builds properly and automate them a little: where the source code comes from, and where the final build output goes. For example, the full PowerShell script libzip.ps1 for building the libzip library from the previous blog post is:

param(
    [string]$SourceAddress = "https://github.com/nih-at/libzip/archive/refs/tags/v1.10.1.zip",
    [string]$SourceZipPath = "../Source/libzip-1.10.1.zip",
    [string]$SourceLocalPath = "./libzip-1.10.1",
    [string]$Generator,
    [string]$MSBuild,
    [string]$InstallDir,
    [string]$SymbolDir
)

# Check whether the installed target file already exists, to decide whether this library needs to be installed
$DstFilePath = "$InstallDir/bin/zip.dll"
if (Test-Path $DstFilePath) {
    Write-Output "The current library is already installed."
    exit 1
}

# Collect the dependent libraries and build them first (libzip requires zlib)
. "./BuildRequired.ps1"
$Librarys = @("zlib")
BuildRequired -Librarys $Librarys

. "./DownloadAndUnzip.ps1"
DownloadAndUnzip -SourceLocalPath $SourceLocalPath -SourceZipPath $SourceZipPath -SourceAddress $SourceAddress

# Clear the old build directory
$BuildDir = $SourceLocalPath + "/build"
if (Test-Path $BuildDir) {
    Remove-Item -Path $BuildDir -Recurse -Force
}
New-Item -ItemType Directory -Path $BuildDir

# Go to the build directory
Push-Location $BuildDir

try {
    # Configure the CMake project
    cmake .. -G "$Generator" -A x64 `
        -DCMAKE_BUILD_TYPE=RelWithDebInfo `
        -DCMAKE_PREFIX_PATH="$InstallDir" `
        -DCMAKE_INSTALL_PREFIX="$InstallDir" `
        -DBUILD_DOC=OFF `
        -DBUILD_EXAMPLES=OFF `
        -DBUILD_REGRESS=OFF `
        -DENABLE_OPENSSL=OFF

    # Build phase: specify the build type
    cmake --build . --config RelWithDebInfo

    # Install phase: specify the build type and the install target
    cmake --build . --config RelWithDebInfo --target install

    # Copy the symbol (PDB) files to the symbol directory
    $PdbFiles = @(
        "./lib/RelWithDebInfo/zip.pdb"
    )
    foreach ($file in $PdbFiles) {
        Write-Output $file
        Copy-Item -Path $file -Destination $SymbolDir
    }    
}
finally {
    # Return to the original working directory
    Pop-Location
}

PowerShell scripts are really powerful; you can even pull functions in from other script files. Here, BuildRequired pre-installs the dependencies of the current library (it actually calls the build scripts of those libraries), and DownloadAndUnzip downloads the source code from the remote address and unzips it into the specified folder. After that we create the build folder, configure the CMake project, build it, and install it. Finally, we copy the library's symbol (PDB) files to the symbol directory.
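
Those two helper scripts are not shown in this post. As a rough idea, a DownloadAndUnzip helper along the following lines would do the job (a minimal sketch only, assuming the source archive is a .zip file; the author's actual helper may differ):

# DownloadAndUnzip.ps1 (sketch): fetch a source archive if needed, then extract it
function DownloadAndUnzip {
    param(
        [string]$SourceLocalPath,   # folder the extracted source should end up in
        [string]$SourceZipPath,     # local path of the downloaded archive
        [string]$SourceAddress      # remote URL of the archive
    )

    # Nothing to do if the source folder is already present
    if (Test-Path $SourceLocalPath) {
        return
    }

    # Download the archive only if it has not been downloaded before
    if (-not (Test-Path $SourceZipPath)) {
        Invoke-WebRequest -Uri $SourceAddress -OutFile $SourceZipPath
    }

    # Extract into the current directory; the archive already contains a versioned top-level folder
    Expand-Archive -Path $SourceZipPath -DestinationPath "./"
}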

In this way you can write one script per dependency and build them one by one. If you ever need to change a build option, you just edit the corresponding script and rebuild; that is the advantage of scripting. However, calling these scripts one by one can hardly be called package management. Borrowing from more sophisticated package managers such as npm, you can write a top-level management script that drives the individual build scripts, as shown in the following PowerShell script BuildCppDependency.ps1:

param(
    [string]$Generator,
    [string]$MSBuild,
    [string]$InstallDir,
    [string]$SymbolDir,
    [string]$Install,
    [string]$List
)

# Create the container for all libraries (a .NET generic list)
$LibrarySet = [System.Collections.Generic.List[string]]::new()
$LibrarySet.Add("zlib") > $null
$LibrarySet.Add("libpng") > $null
$LibrarySet.Add("libjpeg") > $null
$LibrarySet.Add("libtiff") > $null
$LibrarySet.Add("giflib") > $null
$LibrarySet.Add("freetype") > $null
$LibrarySet.Add("OpenSceneGraph") > $null
$LibrarySet.Add("eigen") > $null
$LibrarySet.Add("osgQt5") > $null
$LibrarySet.Add("osgQt") > $null
$LibrarySet.Add("minizip") > $null
$LibrarySet.Add("libzip") > $null
$LibrarySet.Add("opencv") > $null
#$LibrarySet.Add("protobuf") > $null
#$LibrarySet.Add("abseil-cpp") > $null

# Check whether the $Install parameter was passed
if ($PSBoundParameters.ContainsKey('Install')) {
    # Ignore case when comparing
    if ($Install.ToLower() -eq "-all".ToLower()) {
        Write-Output "All libraries will now be installed..."
        foreach ($item in $LibrarySet) {
            Write-Output "Found the library named $item; starting installation..."
            # Dynamically build script filenames and execute them
            $BuildScript = "./$item.ps1";
            & $BuildScript -Generator $Generator -MSBuild $MSBuild -InstallDir $InstallDir -SymbolDir $SymbolDir
        }
    }
    else {
        # Check whether the requested library exists in the set
        if ($LibrarySet.Contains("$Install")) {
            Write-Output "Found the library named $Install; starting installation..."
            # Dynamically build script filenames and execute them
            $BuildScript = "./$Install.ps1";
            & $BuildScript -Generator $Generator -MSBuild $MSBuild -InstallDir $InstallDir -SymbolDir $SymbolDir
        }
        else {
            Write-Output "Cannot find library named $Install !"
        }
    }
}
elseif ($PSBoundParameters.ContainsKey('List')) {
    if ($List.ToLower() -eq "-all".ToLower()) {
        Write-Output "The list of all libraries that can currently be installed in the repository is as follows:"
        foreach ($item in $LibrarySet) {
            Write-Output $item
        }
    }
}
else {
    Write-Host "Please enter the parameters!"
}

Once again you can marvel at the power of PowerShell: as you can see, the library set above is a container provided by .NET, and it can be used directly from PowerShell. The script offers basic listing and installing functionality. For example, to see which libraries can currently be installed, use the following command:

./BuildCppDependency.ps1 -List "-all"

Install specific libraries:

./BuildCppDependency.ps1 -Generator "Visual Studio 16 2019" `
-MSBuild "C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\MSBuild\Current\Bin\" `
-InstallDir "$env:GISBasic" `
-SymbolDir "$env:GISBasic/symbols" `
-Install libzip

Install all libraries:

./BuildCppDependency.ps1 -Generator "Visual Studio 16 2019" `
-MSBuild "C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\MSBuild\Current\Bin\" `
-InstallDir "$env:GISBasic" `
-SymbolDir "$env:GISBasic/symbols" `
-Install "-all"

In fact, a complete package management tool needs many more features and is quite complex. Installing a package is easy enough, but how do you uninstall it? How do you upgrade or downgrade? Can it integrate with the IDE to import and configure dependencies automatically? Let's leave these questions for later.

Finally, the build scripts of this dependency management tool are shared here for your reference: address