Git submodules vs Nuget packages

Our team has experimented with git submodules for some core CRUD functionality shared by most of our products. We have also successfully used Nuget packages (now self-hosted) for some common utilities.

Our core functionality changes often enough that keeping the submodules properly committed is proving to be more of a chore than we expected. I am considering moving the core functionality from a submodule to a Nuget package, but am wondering whether the frequent updates would be even more of a pain with Nuget.
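
To illustrate the chore: every library change currently needs a commit in the submodule plus a pointer-bump commit in each consuming repo, roughly like this (the libs/core path and branch name are just placeholders):

    # inside the shared library, checked out as a submodule
    cd libs/core
    git commit -am "Fix validation in the base repository"
    git push origin main

    # back in EACH product repo that consumes it
    cd ../..
    git add libs/core                 # stages the new submodule commit (the "pointer")
    git commit -m "Bump libs/core submodule"
    git push

    # everyone else then has to remember to run
    git pull
    git submodule update --init --recursive

Multiply the last two steps by the number of products and developers and it adds up.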

Does anyone have experience or guidance on what other challenges I might encounter before making this slightly intrusive change to our architecture and process?



Solution 1

I prefer using submodules over Nuget packages for frequently changing internal libraries. Here's why:

  • Merging: If several developers make changes to the same library at the same time, submodules let those changes be merged like any other git commits. With Nuget packages, obviously, there's no concept of merging.

  • Less wait: With submodules, you push, then pull in whatever repo needs to use the submodule; usually a matter of seconds. With Nuget you must wait for the package to be published, typically only after your CI process completes (see the publish/consume sketch after this list).

  • Versioning clashes: Again, if several developers make concurrent changes, they may bump the Nuget version number to the same value even though their changes are different. How much of a pain this is to untangle depends on your CI process (one common mitigation is sketched after this list).

  • Debugging: Since Nuget packages are consumed as compiled binaries, you can't step into their code out of the box; publishing symbols helps, but it's still clumsier than having the library source checked out as a submodule.

  • Can make changes in "client" repos: Sometimes it's easiest to flesh out the details of library changes while working on client code that will use the library. With submodules, this is possible. Of course, this is no substitute for test coverage.
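
For comparison, the Nuget round trip behind the "Less wait" bullet looks roughly like this; the project path, feed URL and package name are placeholders, and the pack/push steps normally run on your CI server:

    # library repo: pack and publish (normally done by CI after tests pass)
    dotnet pack src/Core.Shared -c Release -p:PackageVersion=2.3.1
    dotnet nuget push src/Core.Shared/bin/Release/Core.Shared.2.3.1.nupkg \
        --source https://nuget.example.internal/v3/index.json \
        --api-key "$NUGET_API_KEY"

    # consuming repo: pick up the new version once the feed has indexed it
    dotnet add package Core.Shared --version 2.3.1

Each of those steps is where the waiting happens, versus a plain git push/pull with a submodule.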
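
As for the versioning clashes, one common mitigation is to stop hand-editing the version and let the build server own the last component; a minimal sketch assuming your CI system exposes a build counter as $BUILD_NUMBER:

    # CI derives a unique version from its build counter, so two concurrent
    # merges can never publish the same package number
    dotnet pack src/Core.Shared -c Release -p:PackageVersion=2.3.$BUILD_NUMBER

Whether that fits depends, as noted above, on how your CI process is set up.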

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
