What you should be doing is using the control-repo pattern, similar to https://github.com/puppetlabs/control.... In that setup, librarian-puppet or r10k pulls any external modules down into the appropriate environment, your 'self-contained' modules live under the 'site' folder, and your environment.conf is updated so that both folders are on your modulepath.
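As a minimal sketch, the environment.conf at the root of the control repo might look like this (directory names follow the convention from that example repo; adjust to your own layout):

```
# environment.conf
# 'site' holds your own roles/profiles; 'modules' is where
# librarian-puppet/r10k installs the external dependencies
modulepath = site:modules:$basemodulepath
```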
You could even use a tool like librarian-puppet or r10k at build time to pull in your dependencies (e.g. from an internal git mirror of each repository) and create a deployment artifact, containing all of the code, that you ship to your puppet master. There is an example of that being done in the Scaling Puppet on AWS ECS With Terraform and Docker talk from PuppetConf 2016.
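The dependencies themselves are declared in a Puppetfile, which both librarian-puppet and r10k understand. A sketch along those lines, where the module names, versions, and the internal mirror URL are all hypothetical:

```
# Puppetfile -- versions and mirror URL are illustrative
forge 'https://forge.puppet.com'

# Pinned Forge module
mod 'puppetlabs-stdlib', '9.4.1'

# Module pulled from an internal git mirror at a fixed tag
mod 'apache',
  :git => 'https://git.internal.example.com/mirrors/puppetlabs-apache.git',
  :tag => 'v10.1.1'
```

Pinning every module to a version or tag is what makes the build reproducible enough to turn into an artifact.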
In that talk, the presenter uses librarian-puppet/r10k to pull all of the modules down locally, builds a .deb package containing all of his module code and configuration, and makes that package available to his puppet master (which runs a single environment inside a Docker container).
There is nothing, however, stopping you from simply having a 'monorepo' with all of your puppet modules in it: copy in the version of each external module you want to use and embed it directly in your baseline. At the end of the day, all that is required for puppet to work is that all of the modules are available within your 'environment', i.e. resolvable on the puppet master's modulepath.
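A monorepo along those lines might look something like this (the layout and module names are illustrative, not prescriptive):

```
puppet-monorepo/
├── environment.conf        # modulepath = modules:$basemodulepath
├── manifests/
│   └── site.pp
└── modules/
    ├── profile/            # your own code
    ├── role/               # your own code
    ├── stdlib/             # vendored copy of puppetlabs-stdlib
    └── apache/             # vendored copy of puppetlabs-apache
```

The trade-off is that you now own the job of updating those vendored copies by hand, which is exactly what librarian-puppet/r10k automate for you in the control-repo approach.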