[feature] Generate multiple packages from single compilation #3106
Breaking the 1-1 relationship between recipe and reference would be extremely difficult. There are still many unknowns in what you are suggesting. For example: what happens with dependencies? If you have a monolithic build and want to split it into different package builds, most of the time there will be relationships between the packages. How do you model that in a single conanfile? You suggest the build() method, but the build method is only called when it is necessary to build from source; what happens when you want to install pre-built binaries? Also, if it is necessary to build those packages from sources, and they are independent, how do we avoid CI re-building them again and again if they are built in parallel? The only possible solution to this problem is already implemented in Conan:
Why not just copy the model, or part of it, from some existing package manager, debian's for example? The hack with build_requires doesn't solve point 2, but in fact getting a subset of deps could be expressed in terms of another, simpler feature request. My goal is to carefully change/sed sources before the build. For this I have to know the dependencies of each specific subproject.
You mean https://wiki.debian.org/PkgSplit? Extracted from there:
Now multiply that by having to support multiple build systems and compilers, conditional dependencies on different OSes, etc. It is simply not feasible without adding incredibly high complexity to the Conan codebase, which would compromise the whole project's functioning and future maintenance.
I don't fully understand why it doesn't solve it. In each package recipe you can specify the dependencies you want. An ordered set of conanfiles and some helper scripts might do the task decently. Which is the project you want to do this for? What would be the sources and the different packages you want to create from it? Maybe the best would be to set up a git repo with a proof of concept of the suggested build_requires and several consumer recipes, and collaborate on it.
Example: We compile ProjectX. ProjectX depends on Dep1 and Dep2. ProjectX has a single conanfile.py and consists of 2 subprojects (read: it has a single *.sln file and 2 *.vcxproj files). Subproj1 doesn't depend on Dep2, just as Subproj2 doesn't depend on Dep1. So we have the dependencies Dep1>Subproj1 and Dep2>Subproj2. The combined package_info of Dep1 and Dep2 has some intersecting flags that lead to a compilation failure when applied to both Subproj1 and Subproj2. The goal is to take the compilation flags/libs of Dep1 and patch only Subproj1 (generate a proper *.props file or use replace_in_file), then do the same for Subproj2, and only then can we compile ProjectX properly.
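As a standalone illustration of the patching step described here, this is a minimal, hypothetical sketch of injecting one dependency's flags into only one subproject's project file. The MSBuild element names are real, but the helper function, paths, and values are assumptions for illustration; an actual recipe would more likely use Conan's `tools.replace_in_file` on the real *.vcxproj files.

```python
# Hypothetical sketch: patch ONE subproject's .vcxproj with ONE
# dependency's include dirs and libs, leaving other subprojects alone.
def inject_deps(vcxproj_text, include_dirs, libs):
    """Prepend dependency-specific include dirs and libs to the
    AdditionalIncludeDirectories/AdditionalDependencies elements."""
    text = vcxproj_text.replace(
        "<AdditionalIncludeDirectories>",
        "<AdditionalIncludeDirectories>" + ";".join(include_dirs) + ";",
    )
    text = text.replace(
        "<AdditionalDependencies>",
        "<AdditionalDependencies>" + ";".join(libs) + ";",
    )
    return text
```

Applied only to Subproj1's .vcxproj with Dep1's flags (and then to Subproj2's with Dep2's), this keeps the two flag sets from ever meeting in the same compilation unit.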
Isn't the problem that some flags are not propagated? Isn't that a different problem?
Are there any instructions on how to use this approach with AppVeyor and "conan create"?
No. Let's assume ProjectX is an OpenSSL wrapper and it has two plugins, one for OpenSSL 1.0 and another for OpenSSL 1.1. So ProjectX will have all the OpenSSL libs in its dependencies, and all the paths to the OpenSSL versions will be added to LIB by Conan. It's also possible that both versions of OpenSSL will have the same library names (e.g. ssl.lib). Now we have no guarantee that Plugin1 will be linked with OpenSSL 1.0 and Plugin2 with OpenSSL 1.1. One of them will definitely link with the wrong version of OpenSSL, leading to a compilation failure.
This looks like a problem. In the general case you can't have 2 versions of the same library in the same dependency graph.
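A toy sketch of why such a graph is rejected: with a single "name/version" slot per library, two different versions of openssl are a conflict by construction. This is illustrative only, not Conan's actual resolution code, and the reference strings are made up.

```python
# Illustrative conflict detection over "name/version" references.
def find_conflicts(requires):
    """Return (name, first_version, other_version) tuples for any
    library requested with more than one version."""
    seen = {}
    conflicts = []
    for ref in requires:
        name, version = ref.split("/", 1)
        if name in seen and seen[name] != version:
            conflicts.append((name, seen[name], version))
        seen.setdefault(name, version)
    return conflicts
```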
I'm going to hijack this feature request, because the title matches what I want to do but the implementation is a little bit different from what @Ri0n described. I'm a member of the KDE community and I'm working on adopting Conan in KDE projects. We have several (many) projects called frameworks.
Example: The KConfig framework produces 2 libraries. Linux distributions build this framework and then package each library separately. We want to do the same thing with Conan, and keep the … We've looked over having … @memsharded, do you have any recommendations for us?
The same as @ovidiub13 says also applies to Qt. Currently there is a Conan package by the community, but it's a big monolithic one; we should instead take advantage of the clean dependencies of the individual libraries, so people can get the parts they need. Especially for deployment that would be a lot easier. There are several libraries living in one git repository, and we could even add conanfiles to the repos, but for that to be viable, it would be great if one conanfile could produce several packages/artifacts.
Hi, thanks @gladhorn and @ovidiub13 for the feedback. There could be different approaches:
I know this is a big investment, and might not be feasible in the short term.
@ovidiub13
The SCM is quite a new feature, and the GitHub issue is already reported, so this might be solved when possible.
I have some thoughts kinda related to the whole monolith/multiple packages thing. @memsharded's solution of having one single monobuild for generating multiple subpackages is great, aside from one little thing that got me confused. I have been thinking about the situation where I do have a dependency with, for example, a single … There are multiple possible use cases for such a schema (boost, Qt, KDE, Bloomberg's package groups), and I think it'd make sense to introduce a feature that would allow producers to hint to the Conan dependency tree that they are related. For example, in one big … But the trick is that something like that would require intrusiveness into the recipe to detect the conflict. And there was that thing about ABI compatibility being defined by the producer instead of the consumer as it is now (which is not really UX friendly), which also requires the same intrusiveness, so I'm interested in your opinions on this whole thing.
@memsharded I agree with your vision that monolithic repos exist currently due to the lack of a package manager, and keeping this in mind, I'm hoping that if we find the right solution for this and write a documentation page on it, we will soon see monolithic projects get modularized, not just in concept but also in sources. For KDE, and I guess also Qt, option 1 would not really work, because it requires having the Conan recipes in their own repos, or at least in a separate location from the actual sources, which would strongly discourage people from using or maintaining them, especially during the "trial" period. Keeping in mind the first paragraph of this comment and @memsharded's options 2 and 3, I'm imagining one solution could be like this:
This is of course a pseudo arrangement, as the repo doesn't actually look like this currently. The … The … The problem with this approach (having the …) … Another option is to have a single … This also brings the problem of how you specify the inner dependencies... It's late for me... @memsharded, can you draft the POC you've mentioned, so we have a base to discuss on?
I'm trying to generate multiple packages from a big CMake build tree according to approach 3 in @memsharded's comment. So I created:

```python
from conans import ConanFile, CMake

class BuildConan(ConanFile):
    name = "Build"
    scm = {
        "type": "git",
        "url": "auto",
        "revision": "auto",
        "submodule": "recursive"
    }
    settings = "os", "compiler", "build_type", "arch"
    generators = "cmake"
    build_requires = ...  # packages needed for the build

    def build(self):
        cmake = CMake(self)
        cmake.configure()
        cmake.build()

    def package(self):
        # copy build artifacts into <Artifact_name>/ folders
        ...
```

and

```python
from conans import ConanFile

class ModuleConan(ConanFile):
    settings = "os", "compiler", "build_type", "arch"
    keep_imports = True

    def build_requirements(self):
        self.build_requires("Build/%s@%s/%s" % (self.version, self.user, self.channel))

    def imports(self):
        self.copy("*", src=self.name, dst=self.name)

    def package(self):
        self.copy("*", src=self.name)
```

This is almost universal, as … using the same … But some artifacts require specific dependencies (…):

```python
from conans import python_requires

base = python_requires("MyBase/0.1@user/channel")

class PkgTest(base.MyBase):
    pass
```

I cannot provide one; the necessary version can only be obtained inside the class. Is there any way around that?
You could write a full dict with the dependencies of every module and use it in the common recipe, without using python_requires, which in your case I think doesn't make much sense.
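The dict-based suggestion could look roughly like this; the module names and package references are made up for illustration, and a real recipe would consult the dict from its requirements() method.

```python
# Hypothetical mapping from each module to its own requirements,
# shared by one common recipe instead of per-module python_requires.
MODULE_DEPS = {
    "Subproj1": ["Dep1/1.0@user/channel"],
    "Subproj2": ["Dep2/1.0@user/channel"],
}

def requirements_for(module):
    """Return only the requirements the given module needs."""
    return MODULE_DEPS.get(module, [])
```

Inside the shared recipe, `requirements()` could then do `for ref in requirements_for(self.name): self.requires(ref)`, so each split package declares only its own subset of the dependency graph.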
@lasote, indeed. Thanks a lot for the hint!
Just now reading this thread. I started out believing a new feature would be required if we're ever going to properly handle existing upstream projects which are superprojects and won't be changing their methodology any time soon. Unfortunately, this feature is likely to be really complex, and will almost certainly not be here within the next year (even if everyone agreed on an implementation). I briefly got pretty excited about @memsharded's clever suggestion regarding build_requires. It seems a good attempt to "make it work with what we've got". But thanks to others' points, I remembered how non-trivial the problems are, and it seems likely to lead to something with a bunch of new caveats. I will continue thinking about the problem.
I think one desire for this feature is to mirror the debian dpkg concept of Source vs. Binary packages: https://www.debian.org/doc/debian-policy/ch-controlfields.html
I haven't read the document you sent, but that's where my proposal comes from. |
Although it is not the same approach, IMO this issue is related to #5242: we are implementing a new feature to model the dependencies of libraries inside a single package, and we will also provide the ability to consume only one of those libraries. That PR is a draft right now, but I hope it will be ready soon. The components feature is not about splitting a single compilation into several packages, but it will allow consuming a single library from a package with several of them. The resulting scenario could be equivalent from the consumer's point of view (although they will download several artifacts to use only one).
On this topic, there's one point I think is valuable to recognize for somewhat novice users of Conan. It's something we realized while making the modular Boost recipes, and which is even more pronounced for the current monolithic Qt recipe, which we did not modularize. There have always been Conan features that enabled package authors to expose a group of pre-compiled "monolithic" binary artifacts as a bunch of independently consumable binary packages. However, the thing that is easy to forget when trying to provide such binary packages is: "What happens when users pass …?" With this in mind, consider Boost and Qt. One might get the idea that the best solution is to just have one recipe which "builds the whole thing" (because that is what their build systems want to do by default), and then a bunch of downstream wrapper packages which just expose the components by name. So, with some advanced logic in the … There are multiple major problems with this:
In summary, I am very glad the Conan team is going to move forward with a first-class feature to tackle this very real and practical challenge. I don't know how they will solve the problem and avoid the challenges above, but if they can figure out a way, it might transform how we think about many of the most popular OSS packages, and perhaps some enterprise ones as well.
Are there any examples of how the repackaging should happen in the recipe that build_requires the monobuild? How can the other recipes access the build contents and re-package them? Say the monobuild produces: lib/libA.so. What should the package() implementation of the individual components' conanfile.py look like? How can it access the build contents of the monobuild from another Conan package?
Hi, @unzap. This is still an open topic with no final answer regarding big packages like … Still, I see the use case where a company wants to repackage a bigger thing into smaller packages. The most important thing here is not to propagate information from the big package to the smaller ones; each package should be able to choose the bits of information it wants to propagate. There have been different approaches:
Summing up: build-requires-host can provide the best level of isolation, but they copy binaries and storage increases. The components approach might not be suitable for every generator/build system, and it will expose all the information from the big package (although it won't be propagated by Conan).
Hi @jgsogo, and thanks for the tip above! After some debugging I can see that build_requires pulls the whole dependency into the build/ directory of the consuming/split package.
So consumers of split_component_A can just conan install the "split_component_A/1.0". |
Hmm, if I use --install-folder=/foo for a split component, then the whole build_requires ends up in foo/? Shouldn't it copy only the contents of the split component's package? Inside the Conan cache I can see that the contents of the split component's package/ folder are correct, i.e. it contains only the files belonging to the split component, not the whole build_requires (monobuild). Maybe I have done something wrong in the recipes...?
No, packages are unitary; there is no way to split them... but using components you can select which components to link from your consumers (though all the packages will be downloaded, installed, ...). Try with …
But you mentioned earlier:
And based on the given snippet I tested the approach where there is one conanfile_monobuild.py to do the actual build, plus conanfile_module_a.py and conanfile_module_b.py. The package/ directory contents for module_a and module_b look correct, i.e. the package/ directory of module_a contains only files that belong to module_a, and similarly for module_b. I used the fnmatch filter possibility in self.copy(...):
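The actual snippet is not preserved above, but the selection that `self.copy(pattern, ...)` performs is fnmatch-style glob matching, which can be illustrated standalone. The file names here are hypothetical; this is not Conan's implementation, just the matching idea.

```python
import fnmatch

def select(patterns, files):
    """Return the files matching any of the given glob patterns,
    mimicking the pattern filter used by self.copy(...)."""
    return [f for f in files
            if any(fnmatch.fnmatch(f, p) for p in patterns)]
```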
So to summarize:

/home/user/.conan/data/monolitemodule/1.0/user/channel/package/58687..963299d/ # contains all files

All looks good above. But:
The build dir will contain all the files from the monobuild? The same result happens if I use "--install-dir" in conjunction with the above command? Is this because I've used this in the recipe as well:
I've used copy_deps() in the recipes. But should that also pull in the build_requires (i.e. the monobuild in this case)?
Then you have … When you run …
Closing as solved, please create new tickets if necessary, thanks |
It's quite a common task for distros' package managers to compile once and split the binaries into multiple packages. Yet such a common and extremely useful feature is, for some reason, not supported by Conan.
So when it's required to generate multiple packages, real magic has to be applied: changes in upstream sources to make them more compatible with Conan; maintaining multiple conanfile.py files in different directories, duplicating most of the information; trickery to make all of this work on third-party CI like AppVeyor and others; recompilation of some common part of the sources for each subproject. It becomes even harder when a project depends on a set of other libraries, each of them having slightly different dependencies.
So I'm hoping you guys can handle this somehow. I basically see two major tasks: