Autotools Mythbuster
Preface
Diego Elio "Flameeyes" Pettenò
Author and Publisher <flameeyes@flameeyes.eu>
David J. "user99" Cozatt
Miscellaneous Editing <olbrannon@gmail.com>
Copyright © 2009-2013 Diego Elio Pettenò
This text is released under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported license. Please refer to Appendix B, License for the complete text of the license.
Abstract
Autotools Mythbuster is a no-nonsense guide to Autotools, written with the idea of providing a full, integrated view of the tools in the GNU build chain: autoconf, automake, libtool, pkg-config, and so on.
Instead of providing a full detailing of the workings of the components, which most consumers are not interested in, this guide focuses on building useful, solid and distribution-friendly packages that developers can submit to distributions without having to jump through hoops to get them accepted.
Autotools Mythbuster is available as an eBook for those who want to read or reference it online. Profits from its sales are used to pay for hosting and domain name registration.
You can find the book on all major Amazon websites, as well as Kobo and their franchises.
1. Foreword
This book is an organised collection of documentation on the suite of software usually referred to as autotools; this suite consists of tools developed by the GNU project, such as autoconf, automake and libtool, along with other tools introduced by FreeDesktop.org, such as pkg-config.
This collection finds its roots in my blog, where I have been writing notes, quick-and-dirty tutorials and commentaries about autotools usage and practices, with the intent of facilitating the solution of common problems found during Gentoo Linux development. While these entries were always as accurate as I could make them, they had obvious limits imposed by the blog format they were written in.
Indeed, like most blogs, these entries were often opinionated and scattered over a long period of time, interspersed with entries that had nothing to do with autotools or with development at all. This made it difficult to search for and access the material.
Trying to get the blog entries into a series of articles would probably have been just as difficult: it would have been extremely unlikely to find a publisher ready to accept such a document. I've thus decided to try to find a way to organise my notes and facilitate collaboration.
This document is going to be extended, updated, corrected and indexed so that the content is easily browsable and usable by everyone who is looking for information about autotools.
I honestly hope that this guide can be useful to others, just as it has been useful for me to learn the subject. I am also hopeful that it will improve the overall quality of build systems based on autotools, reducing the headaches that developers like me have to endure to package new software.
2. Why another guide about autotools?
When I started writing this guide, there weren't many up-to-date guides on autotools in general; the autoconf and automake manuals often covered corner cases rather than general ones, and made it difficult to understand how to write a proper configure.ac or Makefile.am.
The "autobook", by Vaughan, Elliston, Tromey and Taylor was pretty much obsolete by the time I started working in Gentoo, and John Calcote's book ([CalcoteAutotools]) was not out yet. I originally intended working myself on a complete book, but I couldn't find a publisher accepting it — partly because of my critical failure in being a native English speaker.
For a while I wanted to just drop it, but then the time I spent fixing build systems for the packages I maintain piled up enough that I decided my time would be better employed on the guide. The idea behind it was to present an integrated guide that shows the best practices, rather than showing all the possibilities, including those that distributions would have a hard time dealing with.
This is why the document is spotty at best: it started by covering the issues that I had to fix most often, as a way to describe them to the upstream projects when submitting fixes. Over time, I have been working on filling the gaps, which should make for a more complete documentation project.
3. How does this work?
I'm writing this guide as a free and open book, released under the Creative Commons Attribution-NonCommercial-ShareAlike license (the whole content can be licensed under commercial licenses upon request). The availability of this book in source form is meant to allow rebuilding it in any format you see fit for reproduction, and to allow submitting corrections and improvements.
My original idea was to work on this in my spare (but paid) time, accepting donations to extend the content. Unfortunately, this idea met no response, which meant that the content was only extended in my spare time. Since I can no longer receive (substantial) monetary donations, extension will no longer be tied to that.
Unfortunately, the time I feel I can spend on this guide has also been reduced drastically; most likely, I will just extend it as needed to explain a particular problem, instead of writing a blog entry.
If you are interested in a particular topic related to autotools, you're still welcome to contact me at flameeyes@flameeyes.eu and make suggestions.
3.1. How to contribute to the guide
This guide is an open publication: if you wish to contribute, you're very welcome to do so, but you have to grant the same license as the rest of the content (see Appendix B, License).
The sources for this text are available, in DocBook format, at the Gitorious project.
If you are able to improve this guide grammatically or contextually, you're welcome to submit your changes. You can either send me the patches via email, or request a merge from the Gitorious interface.
Please note that if you wish to submit patches, you should follow the same style as the rest of the book. You're going to be credited among authors and editors after your second patch or if your contribution is substantial.
Chapter 1. Configuring The Build — autoconf
Configuring the build consists of running a series of tests to identify the build environment and the presence of the required tools and libraries. Detecting this build environment is a crucial step in allowing portability between different operating systems. In the autotools chain, this is done by the autoconf tool.
The autoconf tool translates a configure.ac file, written in a mixture of m4 and shell scripting, into a configure POSIX shell script that executes the tests determining what the build environment is.
1. M4sh
While the result of the autoconf stage is a complete script compatible with POSIX sh, the language used to write configure.ac is called M4sh, to make clear that it's based on both sh and the macro language M4.
M4sh is often confused with plain sh syntax because it is indeed based on it, augmented with autoconf's own macros and a few more macros that replace parts of the sh syntax.
The major change between the standard sh syntax and M4sh is found in two macros: AS_IF and AS_CASE. The two macros, as is easy to guess, replace two constructs from sh: if..then..elif..else..fi and case..esac.
The two macros are actually replaced with the standard sh syntax after autoconf processing, but they exist for good reason: they make the conditional segments of the script known to autoconf. This in turn helps resolve issues with macros and conditionality.
1.1. M4sh syntax
The basic M4sh macros have a syntax that is directly translatable to sh syntax. It is thus easier to just see how the macros translate:
AS_IF([test], [true], [false])
if test; then
true
else
false
fi
AS_IF([test1], [true], [test2], [true2], [false])
if test1; then
true
elif test2; then
true2
else
false
fi
As you can see, the parameters to the macro aren't fixed in number: you can chain a series of alternative conditions just as you would with the usual sh script.
The parameters are positional: the parameters in odd positions (the first, the third, …), with the exception of the last one, are the truth conditions (the tests); the parameters in even positions are the conditional code executed if the preceding condition results true; and if there is a last odd parameter, it's considered the final alternative condition (the else condition).
Similarly, for AS_CASE:
AS_CASE([$variable], [foo*], [run1], [bar*], [run2], [catchall])
case $variable in
foo*) run1 ;;
bar*) run2 ;;
*) catchall ;;
esac
As for AS_IF, the parameters here are positional, but since the first parameter is reserved as the argument to the case statement, the even positions are for the compared expressions, and the odd ones (starting from the third) contain the code that is executed conditionally. Finally, almost identically to the previous macro, the last parameter is the code executed when nothing else matched.
2. Canonical Systems
When using autoconf, there are three system definitions (or machine definitions) that are used to identify the "actors" in the build process; each definition relates to a similarly-named variable which will be illustrated in detail later. These three definitions are:
host (CHOST)
The system that is going to run the software once it is built, which is the main actor. Once the software has been built, it will execute on this particular system.
build (CBUILD)
The system where the build process is being executed. For most uses this would be the same as the host system, but in case of cross-compilation the two obviously differ.
target (CTARGET)
The system that the software being built will work on. This actor only exists, or rather has a meaning, when the software being built may interact specifically with a system that differs from the one it's being executed on (our host). This is the case for compilers, debuggers, profilers, analysers and similar tools in general.
To identify the current actors involved in the build process, autoconf provides three macros that take care of finding the so-called "canonical" values (see Section 2.1, "The System Definition Tuples" for their format): AC_CANONICAL_HOST, AC_CANONICAL_BUILD and AC_CANONICAL_TARGET. These three macros provide the configure script with sh variables carrying the name of each actor ($host, $build and $target), and with three parameters of the same name, so that the user can override the default discovered values.
The most basic autoconf-based build systems won't need to know any of these values, at least directly. Some other tools, such as libtool, will require discovery of canonical systems by themselves. Since these macros unconditionally add direct and indirect code to the configure script (and a dependency on the two support files config.sub and config.guess), it is recommended not to call them unconditionally.
It is actually quite easy to decide whether canonical system definitions are needed or not: just look for the use of the related actor variable. For instance, if the configure.ac script uses the $build variable, it needs to call AC_CANONICAL_BUILD to discover its value. If the system definition variables are used in a macro instead, the macro should use AC_REQUIRE to ensure that they are executed before it is entered. Don't fear calling them in more than one place; see Section 6.2, "Once-Expansion" for more details.
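For instance, a fragment like the following is a legitimate use of the host definition, and thus requires AC_CANONICAL_HOST. This is only a minimal sketch: the platform-*.c file names and the PLATFORM_SRC substitution are made up for the example.

AC_CANONICAL_HOST

dnl pick a platform-specific source file based on the host tuple
AS_CASE([$host],
  [*-*-linux*], [platform_src=platform-linux.c],
  [*-*-freebsd*], [platform_src=platform-bsd.c],
  [*-*-darwin*], [platform_src=platform-darwin.c],
  [AC_MSG_ERROR([unsupported host system $host])])
AC_SUBST([PLATFORM_SRC], [$platform_src])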
One common mistake is to "go all the way" and always use the AC_CANONICAL_TARGET macro, or its misnamed predecessor AC_CANONICAL_SYSTEM. This is particularly a problem because most software will not have a target actor at all. This actor is only meaningful when the software being built manages data that is specific to a system different from the one it is being executed on (the host system).
In practice, the only places where the target actor is meaningful are the parts of a compile toolchain: assemblers, linkers, compilers, debuggers, profilers, analysers, … For the rest of the software, the presence of an extraneous --target option to configure is likely to just be confusing, especially for software that processes the output of the script to identify some information about the package being built.
2.1. The System Definition Tuples
The system definitions used by autoconf (but also by other packages like GCC and Binutils) are simple tuples in the form of strings. They are designed to provide, in a format easy to parse with "glob masks", the major details that describe a computer system.
The number of elements in these tuples is variable: for some uses that only deal with very low-level code, there can be just a single element, the system architecture (i386, x86_64, powerpc, …); others will have two, defining either the operating system or, most often for such pairs, the executable format (elf, coff, …). These two forms, though, are usually only related to components of the toolchain and not to autoconf directly.
The tuples commonly used with autoconf are triples and quadruples, which define three components: architecture, vendor and operating system. These three components usually map directly into the triples, but in quadruples the operating system is split into kernel and userland (usually the C library).
While the architecture is the most obvious element, and operating systems differ slightly from one another (still probably being the most important data), the vendor value is usually just ignored. It is meant to be the vendor of the hardware system, rather than the vendor of the software, although presently it is mostly used by distributions to brand their toolchain (i386-redhat-linux-gnu) or their special systems (i386-gentoo-freebsd7.0), and by vendors that provide their own specific toolchain (i686-apple-darwin9).
Most operating systems don't split their definitions further into kernel and userland because they only work as an "ensemble": FreeBSD, (Open)Solaris, Darwin, … There are, though, a few operating systems with a split between kernel and userland, the two being managed by different projects or even being replaceable independently. This is the case, for instance, of Linux, which can use (among others) the GNU C Library (GNU/Linux) or uClibc, which become respectively *-linux-gnu and *-linux-uclibc.
Also, most operating systems using triples have a single standardised version for both kernel and userland, and thus provide it as a suffix to the element (*-freebsd7.0, *-netbsd4.0). For a few operating systems, this value might differ from the "product version" used in public: for instance, Solaris 10 uses the definition *-solaris2.10, and Apple's Mac OS X 10.5 uses *-darwin9.
2.2. When To Use System Definitions
To be extended
3. Adding Options
One of the most important features available to developers who use autoconf is certainly the ability to add new options to the ./configure execution, to provide optional build-time support to users. Unfortunately, because of the importance of this feature, it's also one of the most commonly misused.
There are three types of options (or, properly, arguments) that can be added to the configure script:
--enable-*/--disable-* arguments
The arguments starting with the --enable- prefix are usually used to enable features of the program. They usually add or remove dependencies only if those are needed for the particular feature being enabled or disabled.
--with-*/--without-* arguments
The arguments starting with the --with- prefix are usually used to add or remove dependencies on external projects. These might add or remove features from the project.
environment variables
Environment variables that are used by the configure script should also be declared as arguments; their use will be explained below in detail.
The first two kinds of arguments differ only in the displayed name and the macro used, but are in effect handled mostly in the same way. Both are actually used to pass variables, in the form --(enable|with)-foo=bar, and both provide defaults for when the value is omitted (yes for --enable and --with, and no for --disable and --without).
While there is no technical difference between the two, it's helpful for both users and distributions to follow the indications given above about the use of the two kinds of arguments. This makes it possible to identify exactly what each argument is used for.
Environment variables are a recent addition to autoconf and are indeed used by a minority of the projects based on this build system.
3.1. AC_ARG_ENABLE and AC_ARG_WITH
For declaring arguments with the --enable and --with prefixes, there are two different macros that work in basically the same way: AC_ARG_ENABLE and AC_ARG_WITH. Because they work in the same way, the following explanation will only discuss the former, but the same applies to the latter.
Keeping in mind what has been said above about the arguments actually taking a value, and defaulting to either yes or no, the parameters of the macro are as follows:
AC_ARG_ENABLE(option-name, help-string, action-if-present, action-if-not-present)
option-name
Name of the argument; this will be used both for the actual argument option and for the variable to store the result in. It's useful to keep to a subset of characters here, since it'll be translated to a string compatible with sh variable names.
help-string
This is the string used to describe the argument when running ./configure --help. Often it's passed raw directly to the macro, but that will likely make the text not align or fill properly in the help text. It's customary to use the AS_HELP_STRING macro to create the string.
action-if-present
This is the M4sh code used when the user has passed a parameter through --enable-foo; the value of the parameter, if any, is given through the $enableval (or $withval) local variable.
action-if-not-present
This is the M4sh code executed when no parameter of any kind for the given option name has been given to ./configure; this allows setting the default value for variables that are otherwise calculated in the previous action.
Warning
The most common mistake for this macro is to consider the two actions as action-if-enabled and action-if-disabled.
This is not the case!
Since using --disable-foo and --enable-foo=no are equivalent for the macro, you cannot really use this macro with those meanings.
For most uses, there is no actual need to define actions: when no action is defined and the user passes a parameter, autoconf's default is to set a special variable named with the enable_ (or with_) prefix, like enable_foo.
Example 1.1. Using AC_ARG_ENABLE without actions
dnl Example of default-enabled feature
AC_ARG_ENABLE([foo],
AS_HELP_STRING([--disable-foo], [Disable feature foo]))
AS_IF([test "x$enable_foo" != "xno"], [
dnl Do the stuff needed for enabling the feature
])
dnl Example of default-disabled feature
AC_ARG_ENABLE([bar],
AS_HELP_STRING([--enable-bar], [Enable feature bar]))
AS_IF([test "x$enable_bar" = "xyes"], [
dnl Do the stuff needed for enabling the feature
])
In the above example, only the recognised values no and yes (respectively for each case) are acted upon; any other value given (e.g. --enable-foo=baz or --enable-bar=fnord) would be ignored and treated in the same way as the default of no parameter given.
Further safety checking of the value to exclude anything but yes or no can be added, but is usually not necessary for the simplest cases.
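If such a check is desired, a sketch along these lines (reusing the bar option from the example above) would reject any unrecognised value:

AS_CASE(["$enable_bar"],
  [yes|no|""], [],
  [AC_MSG_ERROR([bad value "$enable_bar" for --enable-bar])])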
3.1.1. The Help Strings
Some discussion of the help strings used for declaring arguments with both AC_ARG_ENABLE and AC_ARG_WITH is warranted, since their handling has changed across autoconf versions and they tend to be often mistakenly used.
The second parameter of the two macros above is the description string as output by ./configure --help; since proper formatting of that string is almost impossible to achieve by hand, autoconf provides a macro to generate it: AS_HELP_STRING (replacing the former, deprecated macro AC_HELP_STRING, which is virtually identical, but has a less clear name).
This macro takes care of properly aligning the text, breaking the lines where needed. For instance take the following fragment:
dnl configure.ac text…
dnl to give an example of default-provided help text
AC_HEADER_ASSERT
AC_ARG_ENABLE([foo], [ --enable-foo enable the use of foo through this very long and boring help text])
AC_ARG_ENABLE([bar],
AS_HELP_STRING([--enable-bar],
[enable the use of bar through this very long and boring help text])
)
# output from ./configure --help
--disable-assert turn off assertions
--enable-foo enable the use of foo through this very long and boring help text
--enable-bar enable the use of bar through this very long and
boring help text
As you can see, the text that is not typeset through the use of AS_HELP_STRING is not properly aligned (the distance used in the raw example is two tabulations, at most sixteen spaces depending on the length of the option text, which falls short of the proper alignment by two spaces).
This is not really important as far as the functionality of the script is concerned, but it's useful to keep for consistency. It also allows software inspecting configure.ac files to identify the available options (for eventual graphical frontends to ./configure, or for auto-generation of packaging description files: RPM specifications, Gentoo ebuilds, …).
3.2. Automatic Dependencies with AC_ARG_WITH
Sometimes, the external dependencies of a project can be a hassle, especially if they enable optional features that not every operating system supports or that some users don't really care about. For this reason, they are often made optional, non-mandatory.
When the option is non-mandatory, but it's desirable if certain software is present on the system, it's usual to make the dependency automatic. Automatic dependencies are enabled only if the needed libraries are found, and "soft-fail" in disabling the features if they are not. Distributions further specialise this class into automatic and automagic dependencies; the latter name is used for those dependencies that don't allow being overridden, and thus will always enable the features if the libraries are found, and always soft-fail when they are not. For distributions like Gentoo Linux that build on users' systems, this situation is actually problematic and has to be resolved to properly package the software ([GentooAutomagic]).
To avoid this kind of problem, the best thing is to implement a --with parameter that allows overriding automatic detection: forcing it to yes would make the code fail entirely when the library is not detected, and forcing it to no would make the code skip over the check entirely.
Example 1.2. Using AC_ARG_WITH to declare automatic dependencies.
AC_ARG_WITH([foo],
AS_HELP_STRING([--without-foo], [Ignore presence of foo and disable it]))
AS_IF([test "x$with_foo" != "xno"],
[CHECK_FOR_FOO([have_foo=yes], [have_foo=no])],
[have_foo=no])
AS_IF([test "x$have_foo" = "xyes"],
[do_whatever_needed],
[AS_IF([test "x$with_foo" = "xyes"],
[AC_MSG_ERROR([foo requested but not found])
])
])
Once again, the empty value and any value other than yes and no are handled together as a default case (which we could call the auto case), and no extra sanity check is added.
The library is checked for unless explicitly requested not to, and the $have_foo variable is set accordingly. If foo hasn't been found, but there was an explicit request for it, an error message is displayed and the configure script stops there.
3.3. Environment Variables as Arguments
The AC_ARG_VAR macro is used to declare a particular (environment) variable as an argument for the script, giving it a description and a particular use. While this feature was added relatively recently in the history of autoconf, it is really important. Reflecting its more recent introduction, the macro does not need the AS_HELP_STRING helper, and only takes two parameters: the name of the variable and the string printed during ./configure --help:
AC_ARG_VAR(var-name, help-string)
By default, configure picks up the variables from the environment like any other sh script. Most of those are ignored; those that are not should be declared through this macro. This way, they are marked as precious variables.
A variable marked as precious gets substituted in the Makefile.in without an explicit call to AC_SUBST, but that's not the most important part of the definition. What is important is that the variable is cached.
When running ./configure with the variable set (whether set in the environment or just for that execution via ./configure FOO=bar), its value is saved in either the proper cache file (if requested, see Section 7, "Caching Results") or in the configuration status. This in turn produces two effects, as the sketch after this list illustrates:
- the variable is compared for consistency between different cached runs, to avoid re-using an incompatible cache file;
- the variable is saved for re-execution when using ./config.status --recheck (as used by maintainer mode).
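As a sketch of the typical usage (the FOO_COMPILER variable and the foo-compiler tool are hypothetical names for this example):

AC_ARG_VAR([FOO_COMPILER], [Path to the foo-compiler tool used at build time])

dnl AC_PATH_PROG respects the value if the user already provided one
AC_PATH_PROG([FOO_COMPILER], [foo-compiler])
AS_IF([test "x$FOO_COMPILER" = "x"],
  [AC_MSG_ERROR([foo-compiler is required to build this package])])

The user can then override the detection with ./configure FOO_COMPILER=/opt/foo/bin/foo-compiler, and the value will be remembered when ./config.status --recheck is run.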
4. Finding Libraries
4.1. Searching For System Libraries
When using autoconf for portability, it's necessary to consider that some functions, even standard system ones, are often in different libraries on different operating systems. For instance, the dlopen() function is in libdl on GNU/Linux, and in the standard C library on FreeBSD and other BSD-derived systems.
This is one of the most common things to test, but the commonly-documented solution, which involves the use of the AC_CHECK_LIB macro, leads to either wrong or over-engineered solutions. That macro is used to check for the presence of a known (usually third-party) library, but it does not work that well when you have a list of alternatives to check.
The correct macro to use for this task is AC_SEARCH_LIBS, which is designed with at least two important points in mind:
- There might be no need for further libraries for the function to be available. This may be either because the function is in the C library, or because the library it's found in is already present in the LIBS variable's list. Note that this does not mean the function is present in a library called libc;
- Only one library carrying the function is needed, so testing should stop at the first hit. Testing further libraries might very well lead to false positives and will certainly slow down the configuration step.
Other than these considerations, the interface of the macro is nothing special for autoconf:
AC_SEARCH_LIBS(function, libraries-list, action-if-found, action-if-not-found, extra-libraries)
function
The name of the symbol to look for in the libraries.
libraries-list
A whitespace-separated list of libraries, in library-name format, that have to be searched for the function listed before. This list does not require the C library to be specified, as the first test will be done with just the libraries present in the LIBS variable.
action-if-found, action-if-not-found
As usual, the macro supports expanding code on success and failure. In this instance, each will be called at most once, and the default action-if-found code, adding the library to the LIBS variable, is always executed, even if a parameter is passed.
extra-libraries
Technically, on some if not most operating systems, it is possible for libraries to have undefined symbols that require other libraries to be linked in to satisfy them. This is the case for most static libraries, but it can also happen for some shared libraries.
To make it possible to search such libraries, the macro provides this parameter. There is an implicit value for it: the LIBS variable, which is always passed at link time after the value of this parameter. This list is not added to the variable even on success.
It is important to note that if the symbol were to be found in one of these libraries, you'd be hitting the same case as if the symbol were already available in the libraries listed in LIBS.
Using this macro, the Makefile can ignore which library provides the functions on each operating system. The LIBS variable is set up to list all the needed libraries, hiding the need for anything besides the standard library.
Example 1.3. Looking for two common system libraries with AC_SEARCH_LIBS
dnl The dlopen() function is in the C library for *BSD and in
dnl libdl on GLIBC-based systems
AC_SEARCH_LIBS([dlopen], [dl dld], [], [
AC_MSG_ERROR([unable to find the dlopen() function])
])
dnl Haiku does not use libm for the math functions, they are part
dnl of the C library
AC_SEARCH_LIBS([cos], [m], [], [
AC_MSG_ERROR([unable to find the cos() function])
])
4.2. Checking For Headers
The header files describe the public interface of a C (or C++) library. To use a library you need its public headers, so to make sure a library is available, you have to ensure that its headers are available.
Especially in the Unix world, where a shared library already defines its public binary interface or ABI, the presence of headers can tell you whether the development packages needed to build against that library are present.
To make sure that headers can be found properly, autoconf provides two macros: AC_CHECK_HEADER to find a single header or AC_CHECK_HEADERS to look for more than one header at a time (either replacement headers or independent headers).
4.2.1. Checking for one out of Multiple Headers
It's not unlikely that we have to look for one out of a series of possible replacement headers in our software. That's the case when we've got to support libraries that might be installed at top level or in subdirectories, or when older non-standard headers are replaced with new equivalent ones.
A very common case is looking for either one of stdint.h, inttypes.h or sys/types.h headers to declare the proper standard integer types (such as uint32_t) for backward compatibility with older non-standard C libraries.
In this situation, the order used above is also a priority order, since the first is the preferred header and the third is the least favourite one. It doesn't make sense to test for the other headers once the first is found. There also has to be an error message if none of those is available at all.
The macro AC_CHECK_HEADERS provides all the needed parameters to implement just that. The first parameter of the macro is a sh-compatible list, rather than an M4 list, and it's used as the argument to a for loop in the configure script; there is also an action executed once a header is found (and one executed when a header is not found, but we're not going to use that one).
AC_CHECK_HEADERS([stdint.h inttypes.h sys/types.h],
[mypj_found_int_headers=yes; break;])
AS_IF([test "x$mypj_found_int_headers" != "xyes"],
[AC_MSG_ERROR([Unable to find the standard integers headers])])
In this case, the configure script will check for the headers in sequence, stopping at the first it finds (thanks to the break instruction). This reduces the amount of work needed for the best-case scenario (a modern operating system providing the standard header stdint.h).
Since exactly one of the actions (found or not found) is executed for each header tested, we cannot use the 'not found' action to error out of the script; otherwise, any system lacking stdint.h (the first header tested) would be unable to complete the step. To solve this, we just set a convenience variable once a header is found, and test whether it has been set.
4.2.2. Headers Present But Cannot Be Compiled
Since August 2001, autoconf has been warning users about headers that are present in the system but couldn't be compiled, when testing for them with either of the two macros explained previously.
Before that time, the macros only checked whether the header was present by asking the preprocessor for it. While this did find the whereabouts of headers, it included no information regarding their usability. Without checking whether the headers compiled, it was impossible to say whether a header was valid for the language variety chosen (like C99, or C++), or whether it required unknown dependencies.
To solve the problem, it was decided to use the actual compiler for the currently selected language to look for the headers; but for compatibility with the previous behaviour, at the moment of writing, both tests are performed for each header checked, and the "header present but cannot be compiled" warning is reported if the preprocessor accepts a header while the compiler refuses it.
There are multiple causes for this problem, and thus to solve this warning you have to identify its actual cause. For reference, you can find some examples below.
Example 1.4. header present but cannot be compiled: wrong language dialect
Note
This example is based on an actual mistake in the KDE 3.5.10 build system for the kdepim package.
One situation where the header files are present but cannot be compiled is when they are not designed to work with a particular language variety. For instance, a header might not be compatible with the C99 dialect, or with C++, or on the other hand it might require C99 or C++.
If that particular language is selected, though, and the header is tested for, the behaviour of rejecting the header is indeed what the developers expect. On the other hand, it's possible that the test is being done with a different language than the one the header is going to be used with.
For instance, take the following snippet, that tries to look for the bluetooth/bluetooth.h header from bluez-libs, using a strict C compiler:
AC_INIT
CFLAGS="-std=iso9899:1990"
AC_CHECK_HEADERS([bluetooth/bluetooth.h])
AC_OUTPUT
This will issue the warning discussed above when running configure:
checking bluetooth/bluetooth.h usability... no
checking bluetooth/bluetooth.h presence... yes
configure: WARNING: bluetooth/bluetooth.h: present but cannot be compiled
configure: WARNING: bluetooth/bluetooth.h: check for missing prerequisite headers?
configure: WARNING: bluetooth/bluetooth.h: see the Autoconf documentation
configure: WARNING: bluetooth/bluetooth.h: section "Present But Cannot Be Compiled"
configure: WARNING: bluetooth/bluetooth.h: proceeding with the preprocessor's result
configure: WARNING: bluetooth/bluetooth.h: in the future, the compiler will take precedence
checking for bluetooth/bluetooth.h... yes
The reason for the above warnings can be found by looking at the config.log file that the configure script writes:
configure:3338: checking bluetooth/bluetooth.h usability
configure:3355: gcc -c -std=iso9899:1990 conftest.c >&5
In file included from conftest.c:51:
/usr/include/bluetooth/bluetooth.h:117: error: expected '=', ',', ';', 'asm' or '__attribute__' before 'int'
/usr/include/bluetooth/bluetooth.h:121: error: expected '=', ',', ';', 'asm' or '__attribute__' before 'void'
configure:3362: $? = 1
Looking at those lines in the header file shows that it is using the inline keyword, which is not supported by the C90 language (without extensions). If the header is checked for, though, it means it's going to be used; and even if it does not work with the selected C90 dialect, it might work with the C++ language used by the actual program.
The configure.ac file can then be fixed by changing it to the following code:
AC_INIT
CFLAGS="-std=iso9899:1990"
AC_LANG_PUSH([C++])
AC_CHECK_HEADERS([bluetooth/bluetooth.h])
AC_LANG_POP
AC_OUTPUT
5. Custom Autoconf Tests
While autoconf provides a number of pre-written tests to identify the presence of headers, symbols and libraries, they obviously don't cover the whole range of situations that software developers might face when recognising the environment their software is being built on.
For this reason, autoconf also provides interfaces to write custom testing routines, which can be divided into two main groups: "build tests" and "run tests", depending on how the test is performed. The former group can then be further split into "preprocessing tests", "compile tests" and "link tests", to denote the step of the build chain at which each test stops.
5.1. "Build Tests"
Most of the tests implemented within configure scripts are designed to identify whether the compiler supports a particular syntax, or whether a given symbol, constant or header file is available at build time. These tests, the generic versions of which are available as predefined macros such as AC_CHECK_HEADERS and AC_CHECK_LIB, are the so-called "build tests".
Since the predefined tests don't cover the whole range of tests that could be needed (e.g. they don't provide a way to check for a required minimum version of an API library), autoconf exports macros that allow checking whether some given source code preprocesses, compiles or links properly: AC_PREPROC_IFELSE (formerly AC_TRY_CPP), AC_COMPILE_IFELSE and AC_LINK_IFELSE.
The three macros share the same interface, which itself follows the usual actions-based behaviour of other predefined macros:
AC_PREPROC_IFELSE(input, [action-if-true], [action-if-false])
AC_COMPILE_IFELSE(input, [action-if-true], [action-if-false])
AC_LINK_IFELSE(input, [action-if-true], [action-if-false])
The three macros have progressively stricter requirements for the sources they are provided, given that each brings its input a step further in the usual build chain. This means that the preprocessor doesn't require proper C code, but the compiler does; and while compiling doesn't require a main() entry point, the link test does (as it can only link an executable, not a shared object).
Compared to the "run tests" discussed later on, the "build tests" are safe for cross-compilation, as long as the proper compiler and linker are present in the system.
Even though autoconf makes it available, the use of AC_PREPROC_IFELSE is actively discouraged. When invoked directly, the preprocessor lacks some of the definitions set up by the compiler frontend, and some features might behave inconsistently between the two. For this reason, it is suggested that tests for macro definitions and header presence be performed using AC_COMPILE_IFELSE instead.
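For instance, one common custom "build test" that the predefined macros don't cover is checking whether the compiler accepts a given flag. A minimal sketch of it, with arbitrary variable names, could look like this:

dnl check whether the compiler accepts -fvisibility=hidden
my_save_CFLAGS=$CFLAGS
CFLAGS="$CFLAGS -fvisibility=hidden"
AC_MSG_CHECKING([whether $CC accepts -fvisibility=hidden])
AC_COMPILE_IFELSE(
  [AC_LANG_SOURCE([[int main(void) { return 0; }]])],
  [my_have_visibility=yes],
  [my_have_visibility=no])
AC_MSG_RESULT([$my_have_visibility])
CFLAGS=$my_save_CFLAGS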
5.2. "Run Tests"
Sometimes, the mere presence of a function, or of a constant's definition, is not enough for a test to be considered successful. For instance an interface's optional function might be present as a stub returning a "not implemented" error condition, or a constant might be declared but ignored by the functions that are supposed to make use of it. For these reasons, it is sometimes a necessity to execute the code after building it, and wait for its results.
Important
Executing test code in a build scenario can be tricky: the system used to build a package might very well not be the same one where the code will be executed (this is the case for the build farms of most distributions, Linux and otherwise), which could lead to erroneous results or, in case the architectures of the two systems are not compatible, an unusable test, which will interrupt the course of the ./configure execution.
For those reasons it is important to make sure that the results of all the tests executed on the build host can be overridden. To do so, the best solution is to cache the results, so that a simple environment variable can be used to skip over the test execution, providing the correct, precalculated value.
The basic macro to support "run tests" is AC_RUN_IFELSE (formerly AC_TRY_RUN and AC_TEST_PROGRAM), which extends the AC_LINK_IFELSE flow by executing the just-linked program. It follows the usual actions paradigm, adding a third case for cross-compilation (when the test code cannot run because of architecture incompatibility).
AC_RUN_IFELSE(input, [action-if-true], [action-if-false], [action-if-cross-compiling])
input
The test program's source code; just like with the "build test" macros, it has to be provided through AC_LANG_SOURCE or a variation thereof. Since this macro can be considered the next step after AC_LINK_IFELSE, the same requirements apply. Additionally, the main() function should return a zero status for success, and any non-zero status for failure, like any other shell program.
action-if-true
This block of M4sh code is executed if the test executable was cleanly executed and returned a zero status. If more verbose results are required out of the test case, the code can execute ./conftest$EXEEXT (literally).
action-if-false
This block of M4sh code is executed if the test couldn't be compiled, linked or executed, or if the executed test returned a non-zero status. The status of the latest command executed is available in $?, but there is no way to discern whether it is the compiler's, linker's or the test's status.
action-if-cross-compiling
Finally, this block of M4sh code is executed if the ./configure script is running a cross-compilation. The default content of this section causes the script to abort (through AC_MSG_FAILURE), which is the main reason why "run tests" are frowned upon in environments prone to cross-compilation.
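As a sketch, the following hypothetical test verifies at runtime that snprintf() returns the length the output would have required, as C99 mandates, and conservatively assumes it does not when cross-compiling (the my_snprintf_c99 variable name is invented for the example):

AC_RUN_IFELSE(
  [AC_LANG_SOURCE([[
#include <stdio.h>

int main(void)
{
    char buf[4];
    /* C99 requires the untruncated length to be returned */
    return snprintf(buf, sizeof buf, "12345") == 5 ? 0 : 1;
}
  ]])],
  [my_snprintf_c99=yes],
  [my_snprintf_c99=no],
  [my_snprintf_c99=no])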
5.3. Tests' Input Sources
All the test macros discussed in the previous sections require properly formatted and saved input sources; to help with proper generation of these, autoconf provides the developer with a set of macros, starting with AC_LANG_SOURCE. Starting from autoconf version 2.68, it is no longer possible to provide sources that are not generated by this family of macros without a warning being reported. It is possible that future versions will disallow such behaviour altogether.
The AC_LANG_SOURCE macro is the basis for providing sources to the input parameter of the above-described If-Else macros. It only has one parameter, which is the raw source for the test program in the currently selected language. The generated source file will not only contain the provided sources, but will also include the list of macro definitions emitted by calls to AC_DEFINE.
Important
The sources provided in this macro are expanded twice. This means that you have to quote them twice as well when providing them. So for instance, a test with a simple main() function would be declared this way:
AC_LINK_IFELSE([
AC_LANG_SOURCE(
[[int main() { return 0; }]]
)
])
As a matter of pure convenience, autoconf provides an AC_LANG_PROGRAM macro that takes two distinct arguments: a prologue parameter that is used to emit code outside of the main function's body, and a body parameter that is emitted within the main function's body (main() for C and C++, but it might differ for other languages).
This macro is especially helpful if your test is designed to only check for compiler or linker flags, as the entry point will be generated by autoconf and will return a non-error condition by default. Otherwise, it is simply a wrapper around the already-described AC_LANG_SOURCE macro.
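As a sketch, an equivalent of the previous example written through AC_LANG_PROGRAM would be:

AC_LINK_IFELSE([
  AC_LANG_PROGRAM(
    [[/* prologue: emitted outside of main() */]],
    [[/* body: emitted inside main() */ return 0;]]
  )
])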
Note
As of autoconf 2.68, there are a few more wrapper macros around AC_LANG_SOURCE, which are documented for completeness's sake in the official documentation. They will not be named or documented further here, as their design makes them incompatible with the Erlang language, and their limitations make them unsuitable for use in modern build systems.
6. Autoconf Building Blocks: Macros
The configuration stages of different software packages are never entirely unique. There are repetitive, similar (if not outright identical) tasks that have to be completed for different packages, with more or less sharing between them.
To avoid repeating the same code over and over, most programming languages provide the facility of functions; m4, being a macro language, instead provides macros, which share not only the name but also most of the details with C preprocessor macros: they are expanded inline, they don't have their own variable scope, the parameters are not typed, and so on.
Note
Confusingly enough, the macro to define new macros is called AC_DEFUN, which often misleads developers into thinking about them in terms of functions. This can lead to many problems, which is why I suggest always explicitly thinking and talking about them in terms of macros.
6.1. External Macro Files
Since autoconf macros are often developed to solve generic problems, rather than specific problems of a project (otherwise direct M4sh code would be good enough for most uses), they are often shared across packages and across projects.
In the past, most packages shipped their own macro file with a standardised macro to search for them in a system at build time, making use of particularities of the package, or through configuration helper scripts. For most projects these have been phased out in favour of pkg-config.
There are, though, reusable macros, shipped with various projects or present in archives, such as the Autoconf Archive. Depending on the nature of the macro, the file where it is written is either installed in the system (to be used by autoconf directly) or is simply available to be picked up from the source distribution.
To take two examples, the pkg.m4 file that is shipped with pkg-config is installed in the system, while the attributes.m4 macro file, shipped with xine-lib, PulseAudio and the LScube projects, is simply shared by copying it out of the source distribution or repositories.
When using external macro files to store custom and generic macros (which is, most of the time, the suggested approach), you have to tell autoconf where to look for them. Many different approaches are available for this task, and this guide will try to explain most, if not all, of them.
Note
While there is no functional requirement for it, this guide will assume that all your macro files are inside the m4/ directory; this is the most common directory used to keep the macro files, and by the principle of least surprise, you probably want to put yours there too.
Some projects use other directory names (autoconf/, ac-macros/, …) but this often adds more work for the distributors packaging or fixing the software, since they have to check where to find the macros.
6.1.1. With Just autoconf
When not using automake, and just relying on autoconf, the macro files are not picked up by default.
Indeed, if you just call your testing macro in the configure.ac file, you'll find it merely copied over into the final configure:
% cat m4/test.m4
AC_DEFUN([AUTOTOOLS_MYTHBUSTER], [
AC_MSG_CHECKING([testing])
AC_MSG_RESULT([ok])
])
% fgrep AUTOTOOLS_MYTHBUSTER configure.ac
AUTOTOOLS_MYTHBUSTER()
% fgrep AUTOTOOLS_MYTHBUSTER configure
AUTOTOOLS_MYTHBUSTER()
What you have to do is force the actual inclusion of the macro file in the configure.ac file.
Example 1.5. Including an External Macro File without automake
AC_INIT
m4_include([m4/autotools_mythbuster.m4])
AUTOTOOLS_MYTHBUSTER
The m4_include directive works quite like the #include directive of the C programming language, and simply copies over the content of the file.
The only file that is read by autoconf other than configure.ac is the aclocal.m4 file. This file is often managed with the aclocal utility that ships with automake, so it's strongly suggested not to manage it manually.
6.1.1.1. What About -I m4?
The autoconf tool has a parameter, -I, that adds a directory to the search path for the conversion. This option, though, is not used to discover macro files automatically.
What it is useful for is avoiding the full path name of a macro file, letting it be picked up either from the system or from the local directory (giving priority to the system copy).
AC_INIT
m4_include([pkg.m4])
PKG_PROG_PKG_CONFIG
In this case, the macro file is included with the generic base name pkg.m4 instead of m4/pkg.m4. If the macro file is available to the system (in /usr/share/autoconf, for instance), the macro will be picked up from there; otherwise, if autoconf -I m4 is used, the one in the m4 directory will be used.
6.1.2. Using AC_CONFIG_MACRO_DIR (and aclocal)
Starting from version 2.58, autoconf provides the macro AC_CONFIG_MACRO_DIR to declare where additional macro files are to be put and found. The argument passed to this macro is commonly m4.
For the longest time, this macro has only been used by libtool (starting from version 2.0), to identify where to copy its own macro files when using libtoolize --copy.
Starting from version 1.13, automake augments autoconf with a macro called AC_CONFIG_MACRO_DIRS, which accepts a space-separated list of directories to use for looking up m4 files. The same macro will be available as part of autoconf 2.70.
The list of directories declared in these macros will be used by the aclocal tool to look up the macros called by the configure.ac file. After all the macros (and their dependencies) have been gathered, it will create an aclocal.m4 file that autoconf will use.
% cat configure.ac
AC_INIT
PKG_PROG_PKG_CONFIG
% aclocal
% ls -l aclocal.m4
-rw-r--r-- 1 flame flame 5775 2009-08-06 10:17 aclocal.m4
% fgrep PKG_PROG_PKG_CONFIG aclocal.m4 | grep AC_DEFUN
AC_DEFUN([PKG_PROG_PKG_CONFIG],
% autoconf
% ./configure
checking for pkg-config... /usr/bin/pkg-config
checking pkg-config is at least version 0.9.0... yes
In contrast to what autoconf does, aclocal takes its macro files from the /usr/share/aclocal path, where most software installs them, and copies the macros' definitions directly inside aclocal.m4, appending them to one another. autoconf then reads the file as if it were part of its own macro library.
Local macros will also be looked up, but their content will not be appended to aclocal.m4; instead, the local file will be included through the m4_include directive.
The search path for local files, as of version 1.13 of automake, is defined by the directories listed in the AC_CONFIG_MACRO_DIR and AC_CONFIG_MACRO_DIRS arguments. You can also use ACLOCAL_AMFLAGS to pass an extra -I m4 to aclocal, but that behaviour is deprecated and should not be used.
% cat configure.ac
AC_INIT
AC_CONFIG_MACRO_DIR([m4])
AUTOTOOLS_MYTHBUSTER
% aclocal
% cat aclocal.m4
# generated automatically by aclocal 1.13 -*- Autoconf -*-
dnl […] snip […]
m4_include([m4/autotools_mythbuster.m4])
% autoconf
% ./configure
checking testing... ok
6.1.2.1. The acinclude.m4 file.
In addition to searching its own directory and the include path given on the command line, the aclocal tool takes into consideration another file: acinclude.m4. This file is also copied (rather than included) into the final output of the tool, and then picked up by autoconf.
This file is often used to put together multiple macros from different macro files, without having to use an m4/ directory or equivalent. This usage is discouraged by this guide, because it often leads to overly long files with no logical distinction between macros.
Once again, this has to be considered an old interface kept for compatibility; the m4/ macro directory with its macro files is the suggested method of adding new macros to a project.
6.2. Once-Expansion
Since macro calls are expanded inline, multiple calls to the same macro will cause more code to be emitted in the final configure script. In turn, this will require a longer time to execute the configure phase, both because more code is to be executed, and because bash is easily slowed down by long scripts.
To solve this problem, a subset of macros can be called through the so-called "once-expansion". Such a macro is usually immune to most changes in the current environment, so that the place in the configure script where it is called is not important for the successful completion of the configure phase.
Of course, this comes at a price: once macros cannot be called conditionally, and they lack configurable side effects on success or failure (they can have standard side effects like setting variables and cache values, and generating definitions).
To create a once macro, you define it almost normally, but using the AC_DEFUN_ONCE definition macro. A macro created this way should not make use of parameters that can change between calls; it either has to take parameters used to identify the project or the options to enable (think of AC_INIT), or none at all.
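A minimal sketch of such a macro follows; the MYPJ_CHECK_TLS name and the my_cv_tls cache variable are invented for the example:

AC_DEFUN_ONCE([MYPJ_CHECK_TLS], [
  AC_CACHE_CHECK([for thread-local storage support], [my_cv_tls],
    [AC_COMPILE_IFELSE(
       [AC_LANG_SOURCE([[static __thread int tls_var;
int main(void) { return tls_var; }]])],
       [my_cv_tls=yes],
       [my_cv_tls=no])])
])

dnl no matter how many times it is called, the test is emitted only once
MYPJ_CHECK_TLS
MYPJ_CHECK_TLS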
6.2.1. Once-expanded checks
Similarly to once-expanded macros, recent autoconf versions provide the so-called "once-expanded checks" for functions, headers, declarations, …
The use of these macros is slightly different from the standard checks, since they follow, for the most part, the same rules as once-expanded macros: with the exclusion of the parameter with the list of elements to check for, there can be no parameters executing side-effects or changing the default behaviour.
For instance, when checking for a series of headers with once-expansion, you'd just write this:
AC_CHECK_HEADERS_ONCE([header1.h header2.h header3.h])
In this case you cannot stop when the first of these is found, you cannot error out when one is not found either, and finally you cannot pass further prerequisite headers, not even to fix problems with headers that are present but cannot be compiled.
Like once-expanded macros, once-expanded checks are expanded as early as possible, bundled together, and at most one time. This reduces the amount of code generated, especially if multiple code paths would check for the same header multiple times.
7. Caching Results
Because of the way autoconf is designed, some tests are quite expensive in terms of the work required from the system running the configure script. For this reason, the design of autoconf includes a caching system.
This system provides caching both for the current run (so that the same check in multiple code paths will not require executing the same test twice) and, on disk, for multiple runs. At the same time, it's also often used to provide fake values to sidestep tests that might lead to wrong results.
Warning
The use of this feature is designed for a very narrow use case, as you can see in Section 7.1, "Why Caching is Not What You're Looking For". Please think twice before deciding to make use of this technique when running a configure script.
If you're writing a configure.ac file, though, read on, and follow this advice for properly caching the values, as having cached tests available becomes pretty useful, especially when debugging mistakes.
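The simplest way to honour the cache in custom tests is the AC_CACHE_CHECK macro, sketched here with an arbitrary flag check; note that the cache variable name must contain the string _cv_ for it to be saved to the cache file:

AC_CACHE_CHECK([whether $CC accepts -pipe], [my_cv_cc_pipe],
  [my_save_CFLAGS=$CFLAGS
   CFLAGS="$CFLAGS -pipe"
   AC_COMPILE_IFELSE(
     [AC_LANG_SOURCE([[int main(void) { return 0; }]])],
     [my_cv_cc_pipe=yes],
     [my_cv_cc_pipe=no])
   CFLAGS=$my_save_CFLAGS])

A user (or an automated build tool) can then skip the test entirely by passing my_cv_cc_pipe=yes on the ./configure command line.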
7.1. Why Caching is Not What You're Looking For
Every now and then, the idea appears of using the caching system of autoconf to avoid repeating the same tests when configuring different projects. This approach is dangerous and unstable, for very good reasons.
The first problem is that, while the caching system does provide some sanity checks to ensure that the user hasn't changed settings like the compiler and the flags used to build between different calls, it does not verify that the tests are executed under the same conditions. For instance, it does not take into consideration the currently selected language dialect.
As an example, take two almost identical configure.ac files available as examples of this guide: a default C variant and a C99 variant. These two scripts only check for the thisonlyworksonc99.h header; simply changing the language dialect (not even the full language) changes the results coming from the two different runs:
whynot % ./configure.c CPPFLAGS="-I."
[ ... ]
checking thisonlyworksonc99.h usability... no
checking thisonlyworksonc99.h presence... no
checking for thisonlyworksonc99.h... no
configure.c: creating ./config.status
whynot % ./configure.c99 CPPFLAGS="-I."
[ ... ]
checking thisonlyworksonc99.h usability... yes
checking thisonlyworksonc99.h presence... yes
checking for thisonlyworksonc99.h... yes
configure.c99: creating ./config.status
Because of the way the header is designed, it will only be found when C99 is used (remember that starting from autoconf 2.64, the "usability" test is the dominant one; see Section 4.2, "Checking For Headers"). But if you were to use the same cache file for both scripts, you'd get some funky, unstable results:
whynot % ./configure.c CPPFLAGS="-I." -C
configure.c: creating cache config.cache
[ ... ]
checking thisonlyworksonc99.h usability... no
checking thisonlyworksonc99.h presence... no
checking for thisonlyworksonc99.h... no
configure.c: updating cache config.cache
configure.c: creating ./config.status
whynot % ./configure.c99 CPPFLAGS="-I." -C
configure.c99: loading cache config.cache
[ ... ]
checking for thisonlyworksonc99.h... (cached) no
configure.c99: updating cache config.cache
configure.c99: creating ./config.status
whynot % rm config.cache
whynot % ./configure.c99 CPPFLAGS="-I." -C
configure.c99: creating cache config.cache
[ ... ]
checking thisonlyworksonc99.h usability... yes
checking thisonlyworksonc99.h presence... yes
checking for thisonlyworksonc99.h... yes
configure.c99: updating cache config.cache
configure.c99: creating ./config.status
whynot % ./configure.c CPPFLAGS="-I." -C
configure.c: loading cache config.cache
[ ... ]
checking for thisonlyworksonc99.h... (cached) yes
configure.c: creating ./config.status
As you can see, autoconf does not validate whether the cache comes from the same configure script, nor if the same compiler options are enabled in both runs.
It doesn't stop here, though: since you can write your own tests, you can easily re-use the same cache variable name for very different meanings, and that will also produce bogus results. In turn, bogus results can create multi-layer failures that are difficult to debug unless it is known that the cache was polluted.
To make matters worse, a common but subtle cache pollution problem is related to pkg-config, and the fact that it allows the developer to choose an arbitrary variable prefix when checking for any package: two projects using the same prefix, such as NEEDED or COMMON, to look for different libraries also render the cache unusable between them.
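As an illustration, with hypothetical projects, consider two unrelated configure.ac files that happen to pick the same COMMON prefix for their PKG_CHECK_MODULES calls; both checks store their results in the same pkg_cv_COMMON_CFLAGS and pkg_cv_COMMON_LIBS cache variables, so a cache file shared between the two would hand the first project's flags to the second:
# hypothetical project A
PKG_CHECK_MODULES([COMMON], [glib-2.0])
# hypothetical project B
PKG_CHECK_MODULES([COMMON], [sqlite3])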
The bottom-line suggestion is to make use of caching sparingly, while developing code, without touching the configure.ac file or changing the environment; this is, indeed, a very limited use of caching.
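That limited use amounts to passing -C (or --config-cache) to the script during a development session, so that consecutive runs, with nothing relevant changed in between, can load the saved results instead of repeating every test:
% ./configure -C
configure: creating cache config.cache
[ ... ]
% ./configure -C
configure: loading cache config.cache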
Chapter 2. Harnessing Make — automake
1. Available options
There is a series of options that can be given to automake to change its default behaviour to better suit the needs of a given project.
These options are usually passed to the AM_INIT_AUTOMAKE macro, but most can also be passed via the AUTOMAKE_OPTIONS variable in the top-level Makefile.am. In both representations, it is just a space-separated list of tokens, described further on.
Since not all the options can be passed through Makefile.am, it is recommended to always pass the options through a call to AM_INIT_AUTOMAKE.
Example 2.1. Passing simple automake options
AM_INIT_AUTOMAKE([1.11 silent-rules])
Since each option (or option group) might require a different explanation in different contexts, you can use the following list as a reference for them.
1.8, 1.9, …, 1.12
The minimum (minor) version of automake needed to properly rebuild the current project.
This is usually only used to indicate that the project makes use of features that have been introduced with a given version, which means that it doesn't list micro/patch-level versions (such as 1.12.4), nor can it be used to stop newer versions of automake from rebuilding the project.
gnu, cygnus, gnits, foreign
Choose the "flavour" to use, which specifies some further particular options and warning levels for the current project. SeeSection 1.1, "Automake flavours".
silent-rules
Allows the use of Linux kernel-style "silent" output when building. See Section 3, "Silent Building with Automake". As of automake 1.13, this option is implied by default, and is a complete no-op.
subdir-objects
Compile source files in their relative sub-directories, rather than in the directory of the Makefile.am that declares them. See Section 2, "Non-recursive Automake".
dist-gzip, dist-bzip2, dist-xz, dist-zip, …
Defines the archive format for the distribution of sources, as generated by make dist.
-Wgnu, -Wportability, …, -Werror
Enable (or disable) warning categories in automake. These options correspond to the warning options available on the automake command line.
1.1. Automake flavours
Among different projects, automake is used with different options and settings. Some of the most important projects have their own flavour settings supported by automake directly, as a single option. These are gnu (the default), cygnus (removed in automake 1.13), gnits and finally foreign, which is meant as a "none of the above" option.
Out of these, you generally ought to employ only two: the default settings (which are gnu) and the foreign setting. Most of your projects are likely to use the latter, even though it is not the default, because it relaxes some checks that are, otherwise, often worked around in tutorials.
In particular, the gnits standard only exists as an historical reference on the GNU web site; it may as well not exist at all for the scope of this guide. Similarly, the cygnus flavour, used by hierarchical tree projects such as GCC, has been deemed obsolete starting from automake 1.12 and is no longer accepted starting with version 1.13, so it is likewise ignored by the rest of the guide.
The first important difference between the gnu and foreign flavours is that the former requires the presence of a number of files in the top level of the project, such as NEWS, COPYING, AUTHORS, ChangeLog and README. Often enough, at least the first file in this list is just touch-ed to stop automake from failing.
Note
Even if you plan on using these files the way GNU does, it is still recommended to use the foreign flavour, and manually list these files in Makefile.am so that they are actually installed in the correct place; the gnu flavour only requires them to be distributed, not to be actually installed.
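A minimal sketch of what that could look like in a Makefile.am, assuming the foreign flavour and that the files exist in the source tree; the dist_ prefix ensures they are distributed as well as installed in $(docdir):
dist_doc_DATA = README NEWS AUTHORS ChangeLog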
Another important effect of changing the flavour is the disabling of some of the portability warnings, letting you cleanly use GNU make extensions, which make it much cleaner, nicer and faster to write proper build systems.
2. Non-recursive Automake
One of the common criticisms of automake is that most projects that make use of it use multiple Makefile.am files, one per sub-directory of the project, causing pollution, confusion and further problems. For quite a while now, though, automake has supported non-recursive builds, with a single Makefile.am (or at least a reduced number of them).
This is, actually, the best method to use with automake, since it not only addresses the problems discussed in [MillerRecursiveMake] but also avoids the need for multiple convenience libraries to store the partial results.
To summarise, the advantages of non-recursive make over the recursive "classic" version are:
- make knows all the dependencies of all the files, and thus only rebuilds object files when their sources (or the headers used by the sources) have changed;
- the sources from the sub-directories are compiled and linked directly into the final target, without requiring convenience libraries to be created, which waste time on archiving and linking tasks;
- as an extension, make doesn't have to serialise the calls into sub-directories, which allows a higher number of build processes to be executed in parallel, if so requested; this is particularly important for modern multi-core systems;
- the build's working directory never changes, which allows for a single point of reference for relative paths;
- there is a single Makefile.am to edit, and a single Makefile.in to be processed during ./configure.
2.1. Achieving Non-Recursive Make
The use of non-recursive automake is actually simpler than the use of the recursive variant; instead of creating multiple files, you just need a top-level Makefile.am file that references source files with a relative path:
bin_PROGRAMS = foo
foo_SOURCES = \
src/component1/component1.c \
src/component2/component2.c \
src/main.c
This will compile all the source files into object files directly inside the top build directory; this works fine for most cases, but it might not be desirable if either the source tree is very big (and thus a high number of files would be added to the same directory) or there are source files with the same name minus the path.
To solve this problem, you can ask automake to create objects in sub-directories, following the same structure the sources are in. To do so, change the AM_INIT_AUTOMAKE call in configure.ac and add the option subdir-objects:
AM_INIT_AUTOMAKE([subdir-objects])
For most needs, this will solve the problem of non-recursive automake just fine, and no more tweaks will be needed. For more specific cases, check the following sections as well.
2.2. Sub-dividing results
While the previous section has shown the use of subdir-objects to keep the object files in the same structure as the source files, it has only declared the main program to be built in the top-level build directory. Sometimes this is not the desired behaviour.
It is certainly common to desire some organisation of the build products, for instance grouping together libraries, tools, examples and tests; automake allows that without having to resort to the recursive variant.
Example 2.2. Grouping tests together in a non-recursive Makefile.am
FOO_BUILT_TESTS = tests/foo_test1 tests/foo_test2 tests/foo_test3
TESTS = $(FOO_BUILT_TESTS) tests/foo_test4.sh
check_PROGRAMS = $(FOO_BUILT_TESTS)
tests_foo_test1_SOURCES = tests/foo_test1.c
tests_foo_test1_LDADD = libfoo.la
tests_foo_test2_SOURCES = tests/foo_test2.c
tests_foo_test2_LDADD = libfoo.la
tests_foo_test3_SOURCES = tests/foo_test3.c
tests_foo_test3_LDADD = libfoo.la
In the fragment of Makefile.am above, you can see that the tests for the package are listed with their relative path even in the check_PROGRAMS variable, where the output names are used.
Further down the file, the variables used for passing sources and libraries to the tests also use the full relative path, replacing the / character with _, which is safe to use in variable names.
2.3. Custom Rules and Non-Recursive Makefile.am
When using custom rules to generate files, there are a few problems to be considered. Rules that don't use full relative paths for targets and dependencies could be fouled up by stray files left around. Take for instance the following snippet:
pkgconfigdir = $(libdir)/pkgconfig
pkgconfig_DATA = pkgconfig/foo.pc pkgconfig/foo-bar.pc
%-bar.pc: %.pc
$(LN_S) $^ $@
If a Makefile.am uses the code above, it will fail, creating a symbolic link that also contains the relative path: pkgconfig/foo-bar.pc → pkgconfig/pkgconfig/foo.pc.
To avoid this kind of problem, you can make use of GNU make extended functions in the rules, to transform the path from full-relative form to base form (without the path). For instance the fragment above should be replaced by the following:
pkgconfigdir = $(libdir)/pkgconfig
pkgconfig_DATA = pkgconfig/foo.pc pkgconfig/foo-bar.pc
%-bar.pc: %.pc
$(LN_S) $(notdir $^) $@
3. Silent Building with Automake
The "classical" Makefiles generated by automake up to and including version 1.11 have always been very verbose, during build, printing the full command line of every stage. Albeit this is very helpful during debugging, the practise has been criticised, especially as the Linux kernel, and other build systems, defaulted to a "silent rules" approach.
To overcome this criticism, starting from version 1.11 a new option has been made available to generate Makefiles that can actually build with the new silent rules, even though, by default, the old, verbose builds are used.
test-hellow % make V=1
make all-am
make[1]: Entering directory `/home/flame/test-hellow'
gcc -DHAVE_CONFIG_H -I. -g -O2 -MT hellow.o -MD -MP -MF .deps/hellow.Tpo -c -o hellow.o hellow.c
mv -f .deps/hellow.Tpo .deps/hellow.Po
gcc -g -O2 -o hellow hellow.o
make[1]: Leaving directory `/home/flame/test-hellow'
test-hellow % make V=0
make all-am
make[1]: Entering directory `/home/flame/test-hellow'
CC hellow.o
CCLD hellow
make[1]: Leaving directory `/home/flame/test-hellow'
For automake 1.11 and 1.12, a double opt-in is necessary: automake has to be told to generate the silent-capable Makefiles first, after which it's possible to enable said silent rules at build time. This was done because the generated rules are only compatible with the GNU implementation of make, which means that enabling the option breaks portability and, for that reason, disables the portability warnings.
The first opt-in is to enable the behaviour, which is done inside configure.ac in one of two ways:
- passing the silent-rules option at the call to AM_INIT_AUTOMAKE as is done with many other options;
- using the AM_SILENT_RULES macro directly after the initialisation;
As of version 1.13, though, this opt-in is no longer necessary, as all the generated Makefiles support silent rules. The silent-rules option is now a no-op, doing nothing at all; in particular, it no longer silences the portability warnings.
In this guide, the recommended way has always been to use the explicit macro; this has two main advantages: the first is that you can easily make the call conditional on the actual presence of the macro, keeping backward compatibility with automake versions preceding 1.11; the second is that the call can also change the default behaviour of the Makefiles.
Indeed, whether you just enable the feature through AM_SILENT_RULES or the silent-rules option, or even if you're using automake 1.13 or later, the default build output will not be silent. The silent rules are turned off by default, and the user has to enable them when building with one of the following two methods:
- running ./configure --enable-silent-rules, which enables the silent rules by default;
- running make V=0, which disables the "verbose" build.
As noted above, it is possible to change the default, and make all builds silent unless otherwise requested through either ./configure --disable-silent-rules or make V=1. To do so, you have to pass the value yes to the AM_SILENT_RULES macro.
AM_SILENT_RULES([yes])
3.1. Silent Rules and Backward Compatibility
Since the feature of building with silent rules is only available starting from automake 1.11, enabling the feature in configure.ac, through either the macro call or the silent-rules option, is going to stop all older versions from building the project.
If this is actually a desirable situation, you can also add the 1.11 token to AM_INIT_AUTOMAKE to declare that at least version 1.11 of automake is needed for regeneration. See 1.8, 1.9, …, 1.12 .
If, instead, backward compatibility is required, for instance because the code is tested during development on some systems that don't have a new enough automake yet, it's quite simple to implement the silent rules support conditionally, by using the AM_SILENT_RULES macro explicitly.
Example 2.3. Using Silent Rules Without Forcing Automake 1.11
AM_INIT_AUTOMAKE([foreign])
m4_ifdef([AM_SILENT_RULES], [AM_SILENT_RULES])
This fragment (the actually important part is the call to m4_ifdef, but it has to go after AM_INIT_AUTOMAKE) will call the silent rules macro only if it's actually defined. On older automake versions, it will not be defined and the whole macro will be skipped.
Note
While this allows for backward compatibility, it is suggested not to keep backward compatibility for overly long, as that increases the number of workarounds and tricks needed to avoid breaking the older versions, as new features are implemented and made use of.
As of today, this trick should probably be used only to keep backward compatibility with very old systems where only automake 1.10 is available.
3.2. Custom Silent Rules
While automake has support for silencing all its default rules, when using custom rules you end up outside the scope of the provided support. Adding support for silent rules to custom rules is not exceedingly difficult.
The code that hides the actual command and just replaces it with the CC string is exported in the form of the AM_V_CC variable, and so on for the other output strings. Since most custom rules are used for generating extra files, the AM_V_GEN variable is also available.
Just prefixing the rule with the correct variable expansion is enough to support silent rules; the same method explained above allows for selecting either the verbose or the silent output for the custom rules.
Example 2.4. Silent Custom Rule to Generate a File
%-bar.pc: %.pc
$(AM_V_GEN)$(LN_S) $(notdir $^) $@
4. Parallel Building Tricks
Due to the high focus on parallelisation by modern processor manufacturers, it's extremely important to make sure that your build system supports parallel building, testing and installation, especially for the sake of distributions such as Gentoo Linux and FreeBSD ports, which build software from source.
While the default rules for automake are properly designed to allow for the highest level of parallelisation, there are a few important details that have to be considered to make your build system properly parallelisable.
The first rule of thumb is to make use of the non-recursive features discussed in Section 2, "Non-recursive Automake". Since make can only run rules in parallel when they are in the same directory, while directories are built serially, by moving everything into a single Makefile.am you can run everything in parallel.
4.1. Parallel Install
Parallelising the make install process is something that is often overlooked, simply because it's an I/O-bound task rather than a CPU-bound one. Unfortunately, in some cases libtool will have to perform the linking of libraries again, if the destination folders don't match those used during build, for whatever reason. Since linking is a CPU-bound task, running the install phase in parallel can save you time on multi-core systems.
There are very few issues that you need to consider when dealing with parallel install; the only tricky part is the handling of custom install targets, such as install-exec-local. It's common, when writing these targets, to assume that the target directory has already been created. This would be correct both when the targets are executed in series (as the local targets are executed after the main ones by default) and when not using DESTDIR (as most of the time the directory is already present on the live filesystem).
Example 2.5. Common case of broken install-exec-local target (directory assumed to be present)
bin_PROGRAMS = multicall
install-exec-local:
cd $(DESTDIR)/$(bindir) && \
$(LN_S) multicall command1 && \
$(LN_S) multicall command2
In this case, the multicall executable changes its behaviour depending on the name it has been called as. The build system intends to create multiple symlinks for it during install, but the first call to cd is likely going to fail during a parallel make install execution.
There is only one real way to solve these situations, and that is making sure that the directory exists before proceeding; a common mistake in this situation is to test whether the directory exists, and then call mkdir to create it. This will also fail if, by reason of parallel execution, the directory is created after the test but before mkdir.
Example 2.6. Common case of broken install-exec-local target (directory created on a race condition)
bin_PROGRAMS = multicall
install-exec-local:
test -d $(DESTDIR)/$(bindir) || mkdir $(DESTDIR)/$(bindir)
cd $(DESTDIR)/$(bindir) && \
$(LN_S) multicall command1 && \
$(LN_S) multicall command2
This tries to solve the issue noted in the previous example, but if the Makefile.am is complex enough, parallel target execution can still cause $(bindir) to be created after the test but before the mkdir call, making the latter fail.
All modern mkdir implementations, though, provide the -p option, which not only creates the directory's parents, but also considers it a success if the directory already exists, contrary to its default behaviour.
To make use of mkdir -p, one has to make sure it is supported by the current operating system; autoconf provides a simple way to test for its presence, as well as a replacement script in case it's missing, via the AC_PROG_MKDIR_P macro. After calling that macro from your configure.ac file, you can then make use of $(MKDIR_P) to transparently call the program or the replacement script.
Example 2.7. Correct install-exec-local using AC_PROG_MKDIR_P
bin_PROGRAMS = multicall
install-exec-local:
$(MKDIR_P) $(DESTDIR)/$(bindir)
cd $(DESTDIR)/$(bindir) && \
$(LN_S) multicall command1 && \
$(LN_S) multicall command2
5. Maintainer mode and the missing script
It is common, both during the development of a project and during the packaging phase, to edit the source files of the build system: configure.ac, Makefile.am and so on. Depending on which files are modified, one or more tools of the Autotools stack may need to be executed to produce a new build system, or part of it.
To facilitate the work of both developers and users, automake makes available what is called maintainer mode, a set of rules that regenerate the build system out of its source files when the modification times no longer match. So if you edit Makefile.am, automake will run, and then ./config.status will recheck Makefile.in to produce the final Makefile.
At the centre of these rules is the missing script, which is usually copied over when generating the build system. This script is designed to check for the presence of, and execute, a given tool. When this fails, because the tool is not present or is otherwise incompatible, it will warn the user and mangle the timestamps as if the tool did its job.
The script has, of course, limitations: if a missing tool is required to build an intermediate or final target that is not already present in the source tree, the build cannot proceed. Another limitation is that, while it does not limit itself to the Autotools stack proper, it only includes support for a handful of tools, and it's not possible to extend it beyond those without implementing it in a new version of Automake.
The missing script has been designed to work for source trees where the generated build system is wholly committed to a source control manager, or, conversely, on source trees coming from a distribution tarball where all the files have been already generated once. As committing generated files is generally not recommended, the second case is the one covered in this guide.
As mentioned earlier, by default the generated Makefile will include rules that execute autoconf and automake through missing if either of their source files has a later modification time than their output. Another common situation is to use it with help2man: a man page is generated when creating the distribution; users who have the tool installed can ensure it is kept up to date with the command itself, but at the same time those without the tool installed will not be stopped by the make rule.
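A sketch of such a rule, with hypothetical file and program names, could look like the following in Makefile.am; since the man page is listed in dist_man_MANS it ships within the tarball, so only users who modify the program need help2man themselves:
dist_man_MANS = foo.1
foo.1: foo$(EXEEXT)
	help2man --output=$@ ./foo$(EXEEXT)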
5.1. Execution of ./configure
When configure.ac is modified, and the maintainer mode rules cause a regeneration of the configure script, they also need to execute it, to make sure that the changes take effect; this is achieved through the ./config.status --recheck command. Usually, this also leads to the re-execution of a number of other tools, including automake.
The parameters passed to the original ./configure call are cached in the generated Makefile and are used again, together with any variable that was marked as precious (see Section 3.3, "Environment Variables as Arguments").
If configure.ac is untouched, but the files that are substituted at the end of ./configure changed (by themselves, or because of automake), a faster ./config.status command is executed instead. This will re-generate the files that go through the final substitution without executing all the previous checks.
5.2. Disabling maintainer mode
While this configuration works out quite well to protect against clock skews on pristine, properly generated source archives, it often leads to unexpected behaviour when the archive is not generated through make dist, as there is no guarantee that the generated build system is up to date with its sources.
A similar problem happens to distributions' packagers: if a patch is applied that modifies the build system, you need to ensure that it gets regenerated fully. Simply relying on maintainer mode does not always fit the process, as the wrong version of automake might be present, causing the missing script to just touch the files without regenerating them. Even when explicitly running the Autotools stack, there have been corner cases where maintainer mode got in the way, mainly due to timestamp skew.
A macro called AM_MAINTAINER_MODE exists that controls the behaviour of the self-regenerating rules. If no call to the macro is present, maintainer mode is enabled, and it's not possible to disable it at configuration time. If a call is present without any parameter, maintainer mode is disabled by default and can be re-enabled with ./configure --enable-maintainer-mode.
The suggested call would be as in the following example, as that enables maintainer mode by default (which does the right thing for developers and power users), but allows packaging software to disable the automatic rules, which would only hinder the process.
Example 2.8. Suggested configuration for automake maintainer mode
AM_INIT_AUTOMAKE([foreign])
AM_MAINTAINER_MODE([enable])
Chapter 3. Building All Kinds of Libraries — libtool
While autotools are widely considered black magic, because they tend to present a black box abstracting lots of details about compilers, makefiles and dependencies, the component that can be truly said to be black magic is certainly libtool.
The libtool script is a huge script that allows the developers to ignore, at least partially, all the complex details related to shared objects and their creation. This is needed because this is one of the areas that changes most between operating systems, both in terms of actual technical requirements and in terms of conventions.
Note
Among the different conventions of different operating systems, the first one to disambiguate is the name used to identify these types of files, as almost every operating system uses a different name.
Unix systems classically called these shared objects and this is the name that will be used in this guide. People more involved in another operating system might know them as "dynamically-linked libraries" (DLL) or simply "dynamic libraries".
Some (older) operating systems might not provide any support for shared objects at all; others might require they be built in a specific (non-default) file format. Some architectures insist that shared objects are built with Position-Independent Code (a feature enabled by the compiler). For those operating systems that do support them, the versioning rules might vary enormously.
All these details make it difficult to simply build shared objects in the same way on all operating systems, and here is where libtool enters the scene: the libtool script is a frontend to various compilers and linkers that takes care of abstracting most of the syntax, and of the details for building the libraries.
Because of these differences, the libtool package is probably the most complex part of the autotools stack, and also one that is often avoided by those looking for a simpler build system.
Warning
OpenBSD is known as of July 2012 to use their own implementation of libtool which is not 100% compatible with the original GNU implementation.
Unless otherwise stated, all the documentation in this guide and in particular in this chapter is discussing the original GNU projects and packages, and might or might not apply to OpenBSD's own versions.
1. libtool wrappers
When building programs with the help of libtool, your targets in the build directory consist of shell scripts, instead of the final ELF executable files. These shell scripts are designed to work around some limitations present when working with shared objects on different operating systems.
The most important problem is that, if one of the executable files references a library that was just built, and is not installed in the system already, you have to tell the dynamic loader where said library is to be found; this is even more important when you are building a new version of a library that is already installed on the system. While some operating systems look in the executable's directory for libraries to load, most Unix systems do not, so to point the loader to the new libraries, you have to either use the rpath feature, or set some special environment variables, the name and semantics of which depend on the operating system itself.
Both methods have advantages and disadvantages: rpath has the least overhead during building, but it can (generally) only be set at build time by the link editor, and, since it shouldn't be present on the final, installed executable, usually requires relinking all of the files before installation, which is time-consuming.
On the other hand, using the script approach causes headaches with most debuggers (even though there is a dedicated libtool --mode=debug command): when using the wrapper script, your output file becomes, as said above, a POSIX sh script, while the actual linked executable file is generated within the .libs directory. For instance, if your target is called foo, the actual executable will be .libs/lt-foo.
Obviously, the wrapper/launcher script adds some overhead to the startup of the files themselves; since you're sometimes building test programs that are executed many times during the same build process, libtool provides you with a way to disable the wrapper scripts and instead optimise the output to be executed in place. This is done by adding the -no-install flag to the LDFLAGS list.
Example 3.1. building a test program optimised for in-place execution
check_PROGRAMS = test1
test1_SOURCES = test1.c
test1_LDADD = libmylib.la
test1_LDFLAGS = -no-install
When using -no-install, libtool tells the linker to set the rpath for the output file to the full path of the .libs directory, instead of using a wrapper script to set up the LD_LIBRARY_PATH. This, in turn, eases the debugging phase, as tools such as gdb can be used directly.
While the -no-install option can be very useful when building a working copy of a project to be tested, and especially debugged, some non-GCC compilers, as well as GCC 4.6 and later, throw an error when they are given unrecognised command-line flags, making it impossible nowadays to pass the flag down through ./configure to build a non-wrapper copy of a project.
2. Building plugins
The shared objects technology is used, among other things, to provide so-called "plug-in systems", which allow linking compiled code in at runtime to provide (possibly optional) features.
To implement plug-in systems, you usually need to call the dynamic linker at runtime to ask it to load the plug-in's shared object. This object might just be a standard shared object or might require further details to be taken into consideration.
The call into the dynamic linker also varies in both interface and implementation. Since most Unix-like systems provide this interface through the dlopen() function, which is pretty much identical among them, lots of software relies on just this interface, and leaves to libtool the task of building the plugins.
Software that aims at wider portability among different operating systems will instead want to use the wrapper library and interface called libltdl.
2.1. Using libltdl for plug-ins
2.1.1. Linking, bundling, installing libltdl
Because of the wide adoption of libltdl in many types of applications, its support in autotools is available with great flexibility. This is because its wrapping abilities can easily be used on systems where libtool proper is not usually installed, and thus it's often convenient to have a local copy of it.
But bundling libraries brings problems, and it can especially be a problem to choose between bundling a local copy of the library or just using the system one. The macros provided by libtool, even the recent version 2, support three styles of bundling libltdl: sub-configured directory, non-recursive inline build, or finally recursive inline build.
Besides these three options, there is also the more "standard" option of simply requiring the presence of the library in the system, checking for it as is done for any other dependency. This method is neither assisted nor well documented by the libtool manual and is thus rarely used.
For all three bundling styles provided by libtool, the reference macros in the configure.ac file are LT_CONFIG_LTDL_DIR and LTDL_INIT. When using the sub-configured option, these are the only two calls that you need. When using the inline builds, you need some extra calls.
Example 3.2. Buildsystem changes for bundled libltdl
The relevant configure.ac calls for the various cases, with comments:
# automake needed when not using sub-configured libltdl
# subdir-objects only needed when using non-recursive inline build
AM_INIT_AUTOMAKE([subdir-objects])
# the inline build *requires* the configure header, although the name
# is not really important
AC_CONFIG_HEADERS([config.h])
# the inline build *requires* libtool with dlopen support
LT_INIT([dlopen])
# find the libltdl sources in the libltdl sub-directory
LT_CONFIG_LTDL_DIR([libltdl])
# only for the recursive case
AC_CONFIG_FILES([libltdl/Makefile])
# choose one
LTDL_INIT([subproject])
LTDL_INIT([recursive])
LTDL_INIT([nonrecursive])
The changes for Makefile.am (or equivalent) are trivial for the sub-configured and recursive options (just add the new directory to SUBDIRS), but are a bit more complicated for the non-recursive case. The following is a snippet from the libtool manual to support non-recursive libltdl inline builds.
AM_CPPFLAGS =
AM_LDFLAGS =
BUILT_SOURCES =
EXTRA_DIST =
CLEANFILES =
MOSTLYCLEANFILES =
include_HEADERS =
noinst_LTLIBRARIES =
lib_LTLIBRARIES =
EXTRA_LTLIBRARIES =
include libltdl/Makefile.inc
Whatever option you choose to follow at this point, you must actually bundle the sources in your tree. You probably don't want to add them to your source control system, but rather add the libtoolize --ltdl command to your autogen.sh script or similar.
As the title of this section suggests, you can technically even install the libltdl that you just built. This is not enabled by default, and rightly so (you'd be installing unrequired software outside of the scope of the build process). The reason why this is at all possible is that the macros used by the libtool package itself are exactly the same as those provided to third-party developers.
Finally, there is no macro provided to check for a copy of the library already installed in the system, since libltdl does not install a pkg-config data file either. The best-practice choice is simply to discover the library through AC_CHECK_LIB.
To do that you can use the following snippet of code, for instance:
Example 3.3. Checking for libltdl
AC_CHECK_HEADER([ltdl.h],
[AC_CHECK_LIB([ltdl], [lt_dladvise_init],
[LIBLTDL=-lltdl], [LIBLTDL=])],
[LIBLTDL=])
It's important to check for a function that is present in the currently supported version of libltdl. This snippet checks for the lt_dladvise_init function, a new interface present in libtool 2.2 and later.
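To actually use the discovered library, one would then typically substitute the variable in configure.ac:
AC_SUBST([LIBLTDL])
and add it to the linking variables of its consumers in Makefile.am, here for a hypothetical foo program:
foo_LDADD = $(LIBLTDL)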
2.2. Building plug-ins for dlopen()
When building plug-ins that are to be used directly with the dlopen() interface (or equivalent) and not through the libltdl interface, you usually just need the shared object files, without versioning or other frills. In particular, given that the plug-ins cannot be used statically, you don't need to build the static version at all.
For this reason, when building these very "casual" types of plug-ins, we just rely on a handful of flags for the libtool script:
-module
Ignore the restriction about the lib- prefix for the plug-in file name, allowing free-form names.
-avoid-version
Allows the target to omit version information. Almost no plug-in system uses the library version to decide whether to load an object; they rely instead on the path where they find it.
-shared
Disables entirely the build of the static version of the object; this reduces the number of installed files, as well as avoiding the double build that would otherwise be needed on all the systems where static libraries and shared objects have different build requirements.
Note
This option will make the package incompatible with the --disable-shared option at ./configure time, as well as stopping the build when shared objects are not supported at all.
-export-dynamic
The current object's exposed symbols have to be accessible through dlsym() or equivalent interfaces.
See Section 3.1, "-export-dynamic".
Example 3.4. Building plug-ins for dlopen() usage
pkglib_LTLIBRARIES = foo_example_plugin.la
foo_example_plugin_la_SOURCES = example.c
foo_example_plugin_la_LDFLAGS = -avoid-version -module -shared -export-dynamic
2.3. Exposing fixed and variable plugins' interfaces
When designing plugin interfaces you have two main choices available: either you use a fixed interface or a variable one. In the former case, all plugins export a set of symbols with a pre-selected name, independent of the names of the plugins themselves. With the latter option, the symbols exported by each plugin are instead named after the plugin. Both alternatives have up- and downsides, but these are another topic altogether.
For instance, a fixed interface can consist of three functions, plugin_init(), plugin_open() and plugin_close(), that need to be implemented by each plugin. On the other hand, in the case of a variable interface, the foo plugin would export foo_init(), foo_open() and foo_close().
Depending on which of the two alternative solutions is chosen, you have different approaches to tell the link editor to only show the interface symbols and nothing else, as delineated in Section 3.2, "-export-symbols and -export-symbols-regex", and exemplified below.
Example 3.5. Exposing symbols for plugins with fixed interface
Since each plugin with a fixed interface exports the same set of symbols, and such an interface is rarely extended or reduced, the easiest solution here is to use the static -export-symbols option with a fixed list of symbols:
AM_LDFLAGS = -avoid-version -module -shared -export-dynamic \
-export-symbols $(srcdir)/plugins.syms
pkglib_LTLIBRARIES = foo.la bar.la baz.la
foo_la_SOURCES = foo1.c foo2.c foo3.c
bar_la_SOURCES = bar1.c bar2.c bar3.c
baz_la_SOURCES = baz1.c baz2.c baz3.c
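The plugins.syms file referenced above would then simply list the fixed interface, one symbol per line; reusing the hypothetical names from earlier in this section:
plugin_init
plugin_open
plugin_close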
Example 3.6. Exposing symbols for plugins with variable interface
When the interface is variable, it is common to have either a prefix or a suffix based on the name of the plugin itself; you could then take a generic list of symbols and produce a plugin-specific symbol list from it, or you can make use of -export-symbols-regex with a wider match on the interface.
One of the downsides of using this method is that you then have to carefully name the functions within the plugin's translation units, but the build system should not be used to add workarounds for badly written code.
AM_LDFLAGS = -avoid-version -module -shared -export-dynamic \
-export-symbols-regex '^([a-zA-Z0-9]+)_(init|open|close)$$'
pkglib_LTLIBRARIES = foo.la bar.la baz.la
foo_la_SOURCES = foo1.c foo2.c foo3.c
bar_la_SOURCES = bar1.c bar2.c bar3.c
baz_la_SOURCES = baz1.c baz2.c baz3.c
3. Exposing and Hiding Symbols
For shared objects to be helpful, both in case of dynamic libraries and plugins, they have to expose (or export) an interface to the other programs. This interface is composed of named functions and data symbols, that can be accessed either by linking to the library or by using dlsym() and similar interfaces provided by the runtime loader.
Not all symbols defined within a shared object need to be exported, though. In the vast majority of cases, a dynamic library will provide a set of symbols corresponding to its public API, while for plugins, the interface to be exposed is usually mandated by the host application or library.
Exposing more symbols than necessary can have negative effects on the object in many ways: it almost always increases the time necessary for the dynamic loader to completely load the object, and, if the internal symbols are not properly guarded, it can cause collisions between different objects on operating systems using flat namespaces, as is the case for Linux and most Unix-based systems.
Most, if not all, link editors allow you to avoid this problem by defining a list of symbols to export; any symbol not in such a list will be hidden and thus not be part of the public interface of the object. Since the options used by the link editors to provide this functionality are not standard, libtool abstracts them via three main options: -export-dynamic, -export-symbols and -export-symbols-regex.
3.1. -export-dynamic
The -export-dynamic option is used to declare that the interface exposed by the current object is to be used with the dlopen() and dlsym() functions (or their equivalents on non-Unix operating systems). This is the case, for instance, for all plugins, as seen in Section 2, "Building plugins".
This option is not commonly used for projects whose main target is Linux or other operating systems using ELF for their objects, as any symbol exposed by an ELF object is available to be accessed through the dlsym() function. It is a requirement, though, on other operating systems that distinguish between symbols resolved at build time and those resolved during execution, such as Windows.
3.2. -export-symbols and -export-symbols-regex
As the title implies, the -export-symbols and -export-symbols-regex options are tightly related. Both are used to provide libtool with a list of symbols that should be exposed (the interface of the object).
The first option takes as a single parameter the path to a file, containing the list of symbols to expose, one per line. The second instead takes as a parameter a regular expression: symbols whose name matches the expression will be exposed by the object;libtool takes care of producing the list in that case.
Once libtool knows the list of symbols to expose, it then uses the link editor's own interface to complete the task; this is done through either linker scripts for Unix-derived link editors, or through definition lists for link editors for Windows, as they both serve similar purposes.
Example 3.7. Exposing only the public interface of a library via -export-symbols-regex
lib_LTLIBRARIES = libfoo.la
libfoo_la_SOURCES = foo1.c foo2.c foo3.c
libfoo_la_LDFLAGS = -export-symbols-regex '^foo_'
Using the -export-symbols-regex option makes it very easy to hide unnecessary symbols from a library's interface, but it relies on the library being designed to use a regular pattern for naming non-static functions and data symbols. In the earlier example, for instance, libtool will export all the symbols whose names start with foo_, assuming that the internal symbols use instead a prefix like x_foo or something along those lines.
When this assumption cannot be applied, you have instead to use the other option, -export-symbols, providing it with a complete list of the interfaces to export. The main downside to this method is, obviously, that you have to either compile the list manually (which is prone to errors) or find a different, automated way to produce it, similarly to what libtool does when provided with a regular expression.
4. Library Versioning
One of the hardest parts of developing a library package is the handling of version information, as it relates to the binary interface exposed by said libraries and is, thus, governed by those rules rather than by a more human-readable release version.
To understand what libtool supports, in terms of version handling, it's important to know what that version refers to. Libraries written in compiled languages, such as C and C++, have, in addition to the interface that the programmers need to know, an interface that the dynamic loader, and the consumers, know. The former is called the API, the latter the ABI.
The former includes, among other things, the names of the functions, the meaning of the parameters, and the elements within structures. The latter adds to this the data types used for parameters and return values, and the ordering of the structures themselves. As you can imagine, it's easy to change the ABI without changing the API: a common way to do so is to change the type of a parameter from int to long. Users of the library will not have to change their code, but the size expected in the function, between the two versions, will change.
The rule of thumb, when writing libraries, is basing the release version on the API, which comes down to changing the major version when the interfaces are changed in a completely incompatible way, changing the minor version when interfaces are added, or changed in mostly-compatible ways, and finally increasing the micro, or "patch level" version at each release. Obviously, all the version components to the right of the one that is increased should then be reset to zero.
As for the ABI version, it should not, generally, be tied to the release version; exceptions are possible, but will be documented later on. To solve the issue of ABI compatibility, both ELF and Mach-O provide a simple version number attached to the shared libraries. In the case of ELF, the pair of library name and version is recorded in a field called the DT_SONAME tag, and is used for listing dependencies within the DT_NEEDED tags of the library's consumers.
Since the version attached to a library refers to its ABI, whenever the ABI changes the version needs to change, even if this happens within the same minor version, just with a new release. This is the reason why the two versions are not supposed to have the same value. Whenever the ABI changes in an incompatible way, the DT_SONAME (and its equivalent for non-ELF systems) needs to change, to make sure that the library is not loaded by incompatible consumers.
4.1. Setting the proper Shared Object Version
Developers working on Linux workstations will probably have noticed that most libraries built through libtool have three numbers at the end of their name, such as libfoo.so.0.0.0; this leads to the unfortunate, incorrect implication that the version of the shared object uses all three components. This is not the case. The version of the library is the one indicated in the DT_SONAME tag and is, generally, only one component, so in the aforementioned case, that would be libfoo.so.0.
To set the version of the library, libtool provides the -version-info parameter, which accepts three numbers, separated by colons, called, respectively, current, revision and age. Both their names and their behaviour, nowadays, have to be considered fully arbitrary, as the explanation provided in the official documentation is confusing to say the least, and can be, in some cases, considered completely wrong.
Warning
A common mistake is to assume that the three values passed to -version-info map directly into the three numbers at the end of the library name. This is not the case, and indeed, current, revision and age are applied differently depending on the operating system that one is using.
For Linux, for instance, while the last two values map directly from the command-line, the first is calculated by subtracting age from current. On the other hand, on modern FreeBSD, only one component is used in the library version, which corresponds to current.
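As a sketch, assuming a hypothetical libfoo built on Linux with the following declaration:
lib_LTLIBRARIES = libfoo.la
libfoo_la_LDFLAGS = -version-info 3:0:1
the installed file would be named libfoo.so.2.1.0 (current minus age, then age, then revision), with a DT_SONAME of libfoo.so.2.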
The rules of thumb, when dealing with these values, are:
- Always increase the revision value.
- Increase the current value whenever an interface has been added, removed or changed.
- Increase the age value only if the changes made to the ABI are backward compatible.
The main reason why libtool uses this scheme for version information is that it allows multiple versions of a given library to be installed, with both the link editor and the dynamic linker choosing the latest available at build and run time. With modern distributions' packaging standards, this situation should not be happening anyway.
4.2. Internal libraries' versions
As noted in the previous section, the rules on versions for shared libraries are complex, and bothersome to maintain, but they are fundamental for the compatibility of the consumers of the library. On the other hand, sometimes it's possible to get away without having to follow these rules.
For internal libraries, that are shared among executables of the same package, but are not supposed to have any other consumer, and as such do not install headers, it's possible to simply use the -release option to provide a different name for the library at each release.
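A minimal sketch of this, for a hypothetical internal helper library, would be:
lib_LTLIBRARIES = libhelper.la
libhelper_la_LDFLAGS = -release $(VERSION)
With -release, the installed file is named after the release (for version 1.2.3, libhelper-1.2.3.so), so every release uses a new name and no ABI stability is implied.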
4.3. Multiple libraries versions
While libtool was designed to handle the presence of multiple libraries implementing the same API (and even ABI) on the system, distributions made that necessity moot. On the other hand, it is not uncommon for multiple versions of a library to be installed, with multiple API implemented, allowing consumers to pick their supported version. This is the case, for instance, of Gtk+ and Glib.
The first reaction would be to combine the two options, -release and -version-info; this would, though, be wrong. When using -release, the static archive (the one with the .a extension), the libtool archive (see Section 5, "Libtool Archives") and the .so file used by the link editor would not have a revision appended, which means that two different versions of the library can't be installed at the same time.
In this situation, the best option is to append part of the library's version information to the library's name, which is exemplified by Glib's libglib-2.0.so.0 soname. To do so, the declaration in the Makefile.am has to be like this:
lib_LTLIBRARIES = libtest-1.0.la
libtest_1_0_la_LDFLAGS = -version-info 0:0:0
5. Libtool Archives
One of the most commonly misunderstood features of libtool, especially in modern days, is the presence of the *.la files, the so-called libtool archives. These files are simple textual descriptors of a library built with libtool and are consumed only by libtool and, in some circumstances, libltdl.
The reason for their existence is to be found in the classical UNIX link editors, especially before the introduction of ELF for executables and shared objects. While this modern format carries metadata for the libraries within the file itself, in particular version information (see Section 4, "Library Versioning") and dependencies, static archives and even older shared object formats do not provide that kind of information, so the .la file is used to augment them.
These are the most important variables to be found in a libtool archive file (a one-line description is available within the file itself):
dlname
Purportedly this is the name used when calling dlopen(); on ELF systems, this corresponds to the DT_SONAME tag on the shared object.
library_names
This is a list of equivalent names for the same library; it lists the unversioned filename, the DT_SONAME one, and the fully-versioned file. More detailed information on the topic of versions within libtool is to be found in Section 4, "Library Versioning".
old_library
The name of the static archive, if any has been built, otherwise an empty string. This allows libtool to know whether static linking against the library is possible at all.
inherited_linker_flags
This variable lists selected link editor flags needed to ensure, for instance, that the correct ABI is used with the library. In most cases, all you'll see in here is -pthread, as it's probably the only such flag used on modern systems.
dependency_libs
With this list, libtool aims to compensate for the lack of dependency information in static archives; indeed, it provides a list similar to the one in the DT_NEEDED tags of ELF files.
Contrary to the ELF dependencies, the list does not include version information of any kind, as it's designed to work in turn with static archives, which do not have version information either. Also, these dependencies are used for both static and dynamic linking, causing no small headache for developers and distributors alike, as it is well possible that different versions of the same dependency are brought in due to this list.
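Putting the variables together, an abridged, hypothetical libfoo.la could look like the following; real files carry a few more variables, such as the installation directory and the version components:
# libfoo.la - a libtool library file
dlname='libfoo.so.1'
library_names='libfoo.so.1.0.0 libfoo.so.1 libfoo.so'
old_library='libfoo.a'
inherited_linker_flags=' -pthread'
dependency_libs=' -lm'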
5.1. Usefulness of Libtool Archives in today's age
While .la files were introduced with the aim of solving a real problem, as ELF systems weren't common at the time, their presence on most modern systems is vestigial at best. The only modern consumer of these files, as has been indicated, is libtool itself, which already reduces their usefulness, as projects shy away from it, or even from the whole Autotools stack as a build system.
Note
It might not be obvious, but automake by default will not consult the Libtool Archives at all when linking an executable. To make use of these files, you have to initialise and use libtool, even when building a project that consists only of executables.
Since initialising libtool in these situations will increase build time and complexity, even most Autotools-based projects do not use it.
Historically, KDE 3 used a modified libltdl to load its architectural modules, which relied on the presence of the .la files for all the plugins. Other than that, even software using libltdl nowadays does not rely on these files, and rather accesses the generated plugin files directly (either in ELF or other formats).
Dependency tracking for static linking is nowadays mostly supplanted by the use of pkg-config, as can be read in Section 2.1, "Static linking". Since most build systems, custom and not, have easy access to the data stored in .pc files, without the need to use libtool when linking the consumer, the whole idea of Libtool Archives is considered obsolete.
Indeed, it is not uncommon for distributions to skip packaging of .la files altogether, at least for those packages whose detection is possible through pkg-config.
Chapter 4. Dependency discovery — pkg-config
Discovering the presence of libraries through autoconf as described in Section 4, "Finding Libraries" is somewhat limiting; it does not allow for the library to provide specific information about the flags it needs to compile against (such as threading flags), nor does it allow us to properly pass other libraries we need to link to.
To work around these limitations, a number of projects started including simple scripts, called libname-config for the most part, that provided the flags and libraries needed to build against. Since each of these scripts used a different interface, they usually came with their own custom macro to call the script and store the resulting information in a variable.
Given the similarity of these scripts, the GNOME project created, years ago, a unified interface for them in the form of the pkg-config command, which, instead of including the data within the script, uses simple text files to store and access it.
Nowadays, the tool is used for much more than simply discovering libraries and compiler flags, as it has grown to allow for special handling of static linking, and can provide data from arbitrary variables.
1. File Format of *.pc Files
The heart of pkg-config lies in the data files that the various applications install. These data files are actually simple text files with some special syntax thrown in. They are neither INI-style configuration files, nor simple key-value pairs and are not even complete scripts.
The name of the file is the name of the module that can be tested with the PKG_CHECK_MODULES macro. The content is simple text, which can be divided into variable definitions and keyword declarations. Both are designed to be kept on a single line.
Variables can be used to define temporary values, but they can also provide arbitrary information to pkg-config users when needed. Keywords instead are pre-defined and are used by the commands available as part of pkg-config itself.
Name
Provides a human-readable name for the package; it does not have to be the same as the module name, which is instead decided based on the data file's name.
Version
Complete version of the package; it supports most sane version specifications. Please note that only a single data file per module can be used, so you might have to duplicate part of the version information in both the module name and this keyword.
Requires, Conflicts
These terms specify the dependencies of a module, with or without version limitations. As the names of the terms indicate, Requires lists the other modules that need to be present, while Conflicts lists the packages that cannot be present when making use of the current module.
You cannot list the same module more than once in the requirements, but you can list it as many times as you want in the conflicts. All the modules can optionally have a version specifier, using the basic comparison operators as defined by the C language: =, <, >, <= and >=.
Cflags, Libs
The two fundamental specifications that the pkg-config call will report to its caller, such as the macro discussed in Section 3, "The PKG_CHECK_MODULES Macro". They represent the parameters to pass to the compiler and linker command-lines to make use of the current module.
It's important not to list the entries related to further dependencies, since pkg-config will take care of running transitive dependency discovery, see also Section 2, "Dependencies".
Requires.private, Libs.private
More specific details about the dependencies of a module, see Section 2.1, "Static linking".
Description, URL
Brief information about the package, mostly self-explaining.
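Putting the keywords together, a minimal data file for a hypothetical libfoo module, installed as foo.pc, could look like the following sketch (all names, versions and the URL are made up for illustration):
prefix=/usr
exec_prefix=${prefix}
libdir=${exec_prefix}/lib
includedir=${prefix}/include

Name: foo
Description: Hypothetical library used for illustration
URL: https://example.org/foo
Version: 1.2.3
Requires: bar >= 2.0
Cflags: -I${includedir}
Libs: -L${libdir} -lfoo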
1.1. Search paths
By default, pkg-config will search for modules installing data files in two directories: one for the architecture-specific modules, that is installed in a sub-directory of the libdir (usually /usr/lib/pkgconfig), and one for non-architecture specific modules, that can be shared among multiple architectures (usually /usr/share/pkgconfig).
You can add further paths to its search by defining the PKG_CONFIG_PATH variable, or you can replace the default search paths by setting the PKG_CONFIG_LIBDIR variable. These tricks are often used to create wrappers to pkg-config for cross-compilation, as described in Section 4, "Supporting Cross-Compilation".
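For instance, assuming a copy of the hypothetical libfoo module installed under /opt/foo, its flags could be queried this way:
% PKG_CONFIG_PATH=/opt/foo/lib/pkgconfig pkg-config --cflags --libs foo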
1.2. Include and library paths
When providing the compiler and linker flags, you should always provide those that direct said tools to the paths where the headers and the library are to be found (namely, -I and -L, respectively). It is thus common practice to get the configure script to substitute variables in a template file, and use them to set the flags.
Example 4.1. Simple project's template file and configure.ac code.
The following code is a (snippet of) a template file that would then be named along the lines of foo.pc.in:
prefix=@prefix@
exec_prefix=@exec_prefix@
libdir=@libdir@
includedir=@includedir@
[...]
Cflags: -I${includedir}
Libs: -L${libdir} -lfoo
In the configure.ac file you would have, toward the end:
AC_CONFIG_FILES([foo.pc])
AC_OUTPUT
In template files written this way, it's common to have, besides the obvious definitions of libdir and includedir, the definitions for prefix and exec_prefix. The reason for this is that the variables are defined by default, in autoconf-generated code, as relative to one another, like this:
prefix=/usr/local
exec_prefix=${prefix}
libdir=${prefix}/lib
includedir=${prefix}/include
These definitions require the user to override only the origin point (prefix), offsetting all the other paths at the same time. Since pkg-config has been designed to work in tandem with the rest of the autotools stack, the same expansion rules apply, making it easy to deal with, as long as the previously shown example is followed.
It is important to note that, when paths are known to pkg-config to match the system's default search paths, they are not emitted in the output, to avoid contaminating the compile and link command lines with duplicate search paths, which could slow them down considerably and, in some cases, cause cross-compiling wrappers to fail.
2. Dependencies
The pkg-config file format provides two main interfaces to declare dependencies between libraries (and other projects using these files), as described in Section 1, "File Format of *.pc Files": the Requires definitions and the Libs definitions. Both of them also provide a .private variant. While this might sound redundant, their uses are different enough to warrant their presence.
The Requires definition imports dependency information directly from another .pc data file, including compiler flags (Cflags), libraries, and its own set of required modules.
The Libs definition instead only lists libraries, without inspecting their dependencies; this should usually be used for libraries that do not provide a .pc file, such as system libraries.
The reason why two sets of definitions are present has to be found in the way the link editors work in the UNIX world (Windows and Mac OS X are different worlds), and in particular in the world of ELF: shared libraries carry internally the list of their dependencies, but static libraries (archives) do not. This means that if you're linking statically to something, you need to provide its dependencies explicitly.
The decision on where to list a given dependency should follow a semantic approach: does the current library augment the dependency, or wrap around it? If the consumers of the current library still need to use interfaces from the former, the dependency should be visible directly to the consumers, so declare it as Requires or Libs. If, on the other hand, the dependency's interface is wrapped and hidden from the consumers, it should be declared in Requires.private or Libs.private.
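As a sketch of this rule, consider a hypothetical libfoo that exposes libpng objects through its public interface, while using zlib only internally; its data file would then declare:
Requires: libpng
Requires.private: zlib
Libs: -L${libdir} -lfoo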
2.1. Static linking
When using the private definitions for dependencies, the behaviour of pkg-config will change depending on whether it's targeting dynamic or static linking. In the former case, private dependencies will not be listed in the output, as they are not exposed to the consumer directly, and shared libraries will handle them.
Conversely, when the link is static, it'll expand (recursively) all the private dependencies so that all the libraries are brought in. This is once again needed because of the way UNIX link editors work, as archives do not list their own dependencies. While libtool actually provides a wrapper around these archives, they are getting less and less common, so relying on them is not a good idea.
Unfortunately, at least as of February 2013, there is no easy way to tell an Autotools-based build system, all at once, that you intend to link statically; this means that the most common way to signal this to pkg-config is the command line ./configure LDFLAGS="-static" PKG_CONFIG="pkg-config --static".
3. The PKG_CHECK_MODULES Macro
The main interface between autoconf and pkg-config is the PKG_CHECK_MODULES macro, which provides a very basic and easy way to check for the presence of a given package in the system. Nonetheless, there are some caveats that require attention when using the macro.
3.1. Syntax
PKG_CHECK_MODULES(prefix, list-of-modules, action-if-found, action-if-not-found)
prefix
Each call to PKG_CHECK_MODULES should have a different prefix value (with a few exceptions discussed later on). This value, usually provided in uppercase, is used as prefix to the variables holding the compiler flags and libraries reported by pkg-config.
For instance, if your prefix was to be FOO you'll be provided two variables FOO_CFLAGS and FOO_LIBS.
This value will also be used in the messages printed during the configure checks: checking for FOO....
list-of-modules
A single call to the macro can check for the presence of one or more packages; you'll see later how to make good use of this feature. Each entry in the list can have a version comparison specifier, with the same syntax as the Requires keyword in the data files themselves.
action-if-found, action-if-not-found
As with most of the original autoconf macros, two optional actions are provided, for the cases when the check succeeds or fails. In contrast with almost all of the original macros, though, the default action-if-not-found will end the execution with an error for not having found the dependency.
3.2. Default variables
By default, the macro will set up two variables, joining the given prefix with the suffixes _CFLAGS and _LIBS. The names of these variables can be somewhat misleading, since the former will generally provide the flags to pass to the preprocessor, rather than the compiler, such as include paths and macro definitions, and the latter will provide the library paths as well as the libraries themselves.
On older versions of pkg-config, the macro will not call AC_SUBST on these variables; modern versions (at least version 0.24) will take care of that already. Running it twice, though, will not cause problems, so if you have doubts, you should add a snippet like the following.
AC_SUBST([FOO_CFLAGS])
AC_SUBST([FOO_LIBS])
In addition to defining these variables, the macro also declares them as important variables through AC_ARG_VAR, so that the user can override the values if needed.
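The substituted variables are then used in Makefile.am; a minimal sketch, assuming a hypothetical program foofrontend built against the FOO module from the earlier example, could be:
bin_PROGRAMS = foofrontend
# despite the name, FOO_CFLAGS mostly carries preprocessor flags (see above)
foofrontend_CPPFLAGS = $(FOO_CFLAGS)
foofrontend_LDADD = $(FOO_LIBS)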
3.3. Modules specification
Beside checking for the presence of a package, pkg-config can also check for a minimum (or maximum) version of a package, by using C-style comparison operators, so that you can ensure that the correct version of a package will be used, and not an older or newer version.
You can also check for multiple packages at the same time, by listing one after the other, separated by whitespace. This has the positive effect of emitting less code than testing them separately, but also brings one further problem: since the variables are supposed to be overridable by the user, merging multiple packages together will require the users to find out, and pass, the combined values for all of them.
Example 4.2. Example of module specifications for PKG_CHECK_MODULES
PKG_CHECK_MODULES([FOO], [foo >= 3])
PKG_CHECK_MODULES([BAR], [bar < 4])
PKG_CHECK_MODULES([BAZ], [baz = 2])
PKG_CHECK_MODULES([DEPENDENCIES], [foo >= 3 bar baz <= 4])
3.4. Optional Modules
Sometimes, you're supposed to check for given modules only under some conditions; it's a trivial setup, but it's also the source of one of the most common mistakes. The pkg-config command is discovered through a separate macro, PKG_PROG_PKG_CONFIG, that takes care of identifying the presence (and version) of pkg-config itself. This macro is called through AC_REQUIRE so that it is expanded before PKG_CHECK_MODULES.
If you have the first call to PKG_CHECK_MODULES inside a shell conditional block, the expansion of PKG_PROG_PKG_CONFIG will also be conditional, so the following code will fail to work when the condition is false:
AC_ARG_WITH([gtk], AS_HELP_STRING([--with-gtk], [Build with the GTK+ interface]))
if test "x$with_gtk" = "xyes"; then
PKG_CHECK_MODULES([GTK], [gtk+-2.0])
fi
PKG_CHECK_MODULES([GLIB], [glib-2.0])
Since the check for GTK+ is not executed by default, PKG_PROG_PKG_CONFIG is never expanded, and you'll receive the following error if you try to execute configure without any arguments:
% ./configure
checking for GLIB... no
configure: error: in `/path':
configure: error: The pkg-config script could not be found or is too old. Make sure it
is in your PATH or set the PKG_CONFIG environment variable to the full
path to pkg-config.
Alternatively, you may set the environment variables GLIB_CFLAGS
and GLIB_LIBS to avoid the need to call pkg-config.
See the pkg-config man page for more details.
To get pkg-config, see <http://pkg-config.freedesktop.org/>.
See `config.log' for more details.
You can solve this problem in two ways: you can either explicitly call PKG_PROG_PKG_CONFIG outside of any conditional, forcing the check for pkg-config to happen as soon as possible; or you can rewrite your conditionals to use the proper syntax, as discussed in Section 1, "M4sh".
The proper code written in M4sh for the above logic is the following:
AC_ARG_WITH([gtk], AS_HELP_STRING([--with-gtk], [Build with the GTK+ interface]))
AS_IF([test "x$with_gtk" = "xyes"], [
PKG_CHECK_MODULES([GTK], [gtk+-2.0])
])
PKG_CHECK_MODULES([GLIB], [glib-2.0])
3.5. Alternatives
Sometimes you need to check for alternative modules; for instance, you might want to fall back from udev to HAL if the former cannot be found. You can easily write this by chaining PKG_CHECK_MODULES calls through the action-if-not-found parameter, keeping the preferred choice outermost:
PKG_CHECK_MODULES([UDEV], [libudev],
[AC_DEFINE([HAVE_UDEV], [1], [Use UDEV])],
[PKG_CHECK_MODULES([HAL], [hal],
[AC_DEFINE([HAVE_HAL], [1], [Use HAL])
])
])
Note that all the parameters here are quoted; this is important: if you don't quote the chained PKG_CHECK_MODULES call properly, you will receive a syntax error when executing ./configure.
As an exception to the rule stated earlier, it's possible to use two chained calls to PKG_CHECK_MODULES with the same prefix; this is useful to handle cases where you explicitly need co-variant versions of two packages, or where a package renamed its own data file.
PKG_CHECK_MODULES([NM], [libnm-glib],, [
PKG_CHECK_MODULES([NM], [libnm_glib])
])
Even here, remember to quote the chained call.
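If the rest of the build system then needs to know which alternative was selected, one possible sketch (the backend variable and the conditional name are made up for illustration) is to set a shell variable in the action-if-found parameters and test it afterwards:
PKG_CHECK_MODULES([UDEV], [libudev],
  [backend=udev],
  [PKG_CHECK_MODULES([HAL], [hal], [backend=hal])])
AM_CONDITIONAL([BACKEND_UDEV], [test "x$backend" = "xudev"])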
4. Supporting Cross-Compilation
The design of the current pkg-config application and the interface of PKG_PROG_PKG_CONFIG allows them to be instrumental in proper cross-compilation of software, when used correctly. This only requires following a few simple rules.
4.1. Paths Handling
When cross-compiling packages with multiple dependencies or entire operating system images, the focus is usually around one specific directory, called sysroot, used as prefix while mimicking the installation layout of a normal running system. This path needs to be prefixed to the paths added to the list of searched paths for headers and libraries, i.e., those passed via -I and -L to the compiler and link editor respectively. At the same time, it should not be compiled into programs for runtime use, nor should it be used as the destination path during installation.
Since pkg-config's original, and main, task is to report flags and paths, it is crucial that sysroot handling is taken into consideration. At the time of writing, with version 0.25 of the pkg-config package, this is achieved mainly through the PKG_CONFIG_SYSROOT_DIR variable, which is set to the path of the sysroot, and is inserted in-between the -I or -L flags and the following path.
Important
The content of PKG_CONFIG_SYSROOT_DIR is not injected in paths that are returned by pkg-config --variable, which makes them unsuitable to use during cross-compilation unless specifically designed to be used at that time.
To design a variable containing a path that needs to be used at build time, such as the path where a generation script can be found, you can prefix it in the .pc file with the built-in variable ${pc_sysrootdir}.
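For instance, a hypothetical foo.pc could export the path of a code generator to be run at build time this way:
generator=${pc_sysrootdir}${exec_prefix}/bin/foo-gen
Consumers would then query it through $PKG_CONFIG --variable=generator foo, receiving a path that is valid on the host at build time.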
4.2. Tool Calling Conventions
Often, during cross-compilation, builds mix tools to run on the host and libraries to install on the target, making it unfeasible to simply set PKG_CONFIG_SYSROOT_DIR during the build. To cope with this, the usual method is to set the variable from within a wrapper script, named either after a custom convention or, more generally, ${CHOST}-pkg-config.
This is supported by the autoconf macros provided by the package, as they all respect $PKG_CONFIG if set in the environment, and look for a target tool (${CHOST}-pkg-config) before falling back to the usual pkg-config command.
Important
When using the tool to identify variables within a configure.ac or Makefile.am file, it is thus important not to call it directly, but rather to call $PKG_CONFIG, so as not to bypass sysroot awareness.
It also requires other build systems to respect the value set in the environment, which has to be verified on a system-by-system basis.
The wrapper script should not only set the PKG_CONFIG_SYSROOT_DIR variable: when cross-compiling you want to ignore the packages installed in the host system, and instead rely only on those installed in the cross-compiled environment. This is achieved by resetting PKG_CONFIG_PATH (which lists additional search paths), and at the same time setting PKG_CONFIG_LIBDIR to override the default base search paths.
As of pkg-config version 0.28, a tool-prefixed executable, with the same name as the wrapper documented in this section, is installed by default, both when cross-compiling and when not, to support multiple ABIs on the same system. This does not, though, make the wrapper approach obsolete yet.
Example 4.3. Common pkg-config wrapper script for cross-compilation
#!/bin/sh
SYSROOT=/build/root
# ignore modules installed on the build host
export PKG_CONFIG_PATH=
# only search the data files installed in the sysroot
export PKG_CONFIG_LIBDIR=${SYSROOT}/usr/lib/pkgconfig:${SYSROOT}/usr/share/pkgconfig
# prefix the reported -I and -L paths with the sysroot
export PKG_CONFIG_SYSROOT_DIR=${SYSROOT}
exec pkg-config "$@"
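Assuming the script is saved as armv7a-unknown-linux-gnueabi-pkg-config (a hypothetical ${CHOST} value) and made available in the path, the autoconf macros will pick it up by themselves when cross-compiling; it can also be forced through the environment:
% ./configure --host=armv7a-unknown-linux-gnueabi PKG_CONFIG=armv7a-unknown-linux-gnueabi-pkg-config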
Chapter 5. Forward Porting Notes
Like most software, autoconf, automake and the rest of the autotools family are not set in stone. Each new release strives to improve itself and the way it is used.
This causes a phenomenon of "bit-rotting", similar to the one caused by compilers and interpreters on source code: sources that used to work just fine will start issuing warnings, and further on, it starts not to work any longer.
While the direct reason for this is the changes in the tools, which become stricter in what they accept, or simply change their behaviour to increase compatibility and speed, the underlying reason is that build systems often use undocumented features, or misuse documented ones.
For this reason it is important for the developers using these tools to know exactly what changes between autotools versions. By knowing the changes in behaviour, it is possible to ensure that a given build system works with the newest released version of the software.
1. Changes in autoconf releases
1.1. autoconf 2.64
Release 2.64 of autoconf is an important release for developers working on old build systems based on it, because it is the release where the "present but cannot be compiled" warning switches behaviour.
Noteworthy changes in autoconf version 2.64
- After eight years, AC_CHECK_HEADER takes as authoritative the result from the compiler, rather than the preprocessor, when the "header present but cannot be compiled" warning is issued. Refer to Section 4.2.2, "Headers Present But Cannot Be Compiled" for the details on how to fix the related issues.
- The internal macro AH_CHECK_HEADERS has been removed; while this is an internal change that should not affect properly-written build systems, it actually breaks packages using the KDE 3.x autotools-based build system, in particular the KDE_CHECK_HEADERS macro. To work with autoconf 2.64, KDE3-based software should replace the calls to the KDE-specific macro with equivalent calls to the proper, standard AC_CHECK_HEADERS macro, after properly setting up the language to C++ if needed.
1.2. autoconf 2.68
There aren't many specific changes in autoconf 2.68, but this version provides a new milestone in the package's history, as it solves the (many) regressions introduced in the previous two versions. For this reason, within the scope of this document, I'll be documenting the changes in 2.66, 2.67 and 2.68 as a single section.
Warning
You are encouraged to avoid versions 2.66 and 2.67 of autoconf as much as possible. While they do include a few regression fixes over 2.65, they also introduce a much longer series of mistakes and regressions that were fixed only in 2.68.
Noteworthy changes in autoconf version 2.66 through 2.68
- Macros designed to check for functionality (more than presence) of common library functions are being deprecated in favour of the gnulib project framework. Eventually, alternative, slimmer macros might be found in the Autoconf Archive. The macros involved are: AC_FUNC_ERROR_AT_LINE, AC_FUNC_LSTAT_FOLLOWS_SLASHED_SYMLINK, AC_FUNC_MKTIME and AC_FUNC_STRTOD.
- The If-Else family of macros (see Section 5, "Custom Autoconf Tests") has gained a new safety switch to ensure that the source code being compiled is setting the expected defines that were discovered up to that point. If you call any of those macros with literal source code as the body, you'll be presented with a warning similar to the following:
configure.ac:XX: warning: AC_LANG_CONFTEST: no AC_LANG_SOURCE call detected in body
../../lib/autoconf/lang.m4:194: AC_LANG_CONFTEST is expanded from...
../../lib/autoconf/general.m4:2591: _AC_COMPILE_IFELSE is expanded from...
../../lib/autoconf/general.m4:2607: AC_COMPILE_IFELSE is expanded from...
configure.ac:XX: the top level
This means that calling macros such as AC_PREPROC_IFELSE or AC_LINK_IFELSE now requires the use of AC_LANG_SOURCE or AC_LANG_PROGRAM to generate the source code to compile. As an alternative, the AC_LANG_DEFINES_PROVIDED macro can be used within the first parameter to stop autoconf from warning you about it.
It is important to note that you need to ensure that the call to AC_LANG_SOURCE is quoted and not expanded, otherwise that will cause the warning to appear nonetheless. See the following code:
dnl Old-style code (will issue a warning)
AC_LINK_IFELSE([int main() { return 0; }],
[some=thing], [some=other])
dnl Wrongly updated code (will still issue a warning)
AC_LINK_IFELSE(AC_LANG_SOURCE([int main() { return 0; }]),
[some=thing], [some=other])
dnl Correctly updated code
AC_LINK_IFELSE([AC_LANG_SOURCE([int main() { return 0; }])],
[some=thing], [some=other])
2. Changes in automake releases
About deprecation of automake features
The development, both historical and contemporary, of automake is full of features that are introduced, obsoleted, deprecated and removed. Due to this, it's important to know a few things about its deprecation process.
First of all, it's unfortunately common for newly-released series (1.12, 1.13, 1.14, ...) to have regressions that are addressed in subsequent micro-releases. When important regressions are present, a warning will be noted at the top of the list of changes.
More importantly, when features (variables, macros, …) are marked as deprecated, their use causes a warning to be printed, as long as the -Wobsolete option is passed (which it is not, by default). If this option is paired with -Werror, it can cause features that are still present, but deprecated, to trigger a failure in Makefile generation.
2.1. automake 1.12
Warning
Versions 1.12 and 1.12.1 of automake have been released with a regression in the behaviour, related to the deprecation of AM_PROG_MKDIR_P, as noted later in this section. It is thus of the utmost importance to not rely on the behaviour of these versions.
Unsupported behaviour in automake starting from version 1.12
- The long-deprecated AM_PROG_MKDIR_P macro (previously called by default by AM_INIT_AUTOMAKE) is now reporting itself as such; this, if combined with the -Werror option, will cause automake to fail on projects that are calling the macro, either explicitly, or through another macro, for example gettext's, which is still unfixed as of December 2012.
This macro enabled the use of $(mkdir_p) and @mkdir_p@ in Makefile.am, which are still present and will still be available in version 1.14, but are also considered deprecated, and should thus not be used in new build systems. Please also note that automake releases 1.12 and 1.12.1 mistakenly removed the expansions for $(mkdir_p) and @mkdir_p@ altogether, breaking more than a few packages in the process.
See Section 4.1, "Parallel Install" for further details.
- The dist-lzma option, used to produce .tar.lzma archives, has been removed.
The LZMA compression format has undergone different, incompatible revisions over its short life, and has been deprecated for most uses. In its place you can use the new dist-xz option.
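The change amounts to one option in the initialisation call; a minimal sketch (the foreign flavour is just for illustration):
AM_INIT_AUTOMAKE([foreign dist-xz])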
- The support for de-ANSI-fication (ansi2knr Automake option) that was deprecated in version 1.11 has been removed from this version.
This feature allowed for code to be written as specially-formatted ANSI C code, to be translated at build time through the ansi2knr tool, bundled in the source distribution.
Since C was first standardized in 1989, there should be no concern that any compiler will accept only pre-standard C code. Furthermore, the use of ansi2knr is not supported during cross-compilation, so there is no real benefit in using this feature any longer.
Even though this feature is hardly ever used, there are projects that still rely on the initialisation macro AM_C_PROTOTYPES; this macro is defined in automake 1.11 this way:
AC_DEFUN([AM_C_PROTOTYPES],
[AC_REQUIRE([AC_C_PROTOTYPES])
if test "$ac_cv_prog_cc_stdc" != no; then
U= ANSI2KNR=
else
U=_ ANSI2KNR=./ansi2knr
fi
# Ensure some checks needed by ansi2knr itself.
AC_REQUIRE([AC_HEADER_STDC])
AC_CHECK_HEADERS([string.h])
AC_SUBST([U])dnl
AC_SUBST([ANSI2KNR])dnl
_AM_SUBST_NOTMAKE([ANSI2KNR])dnl
])
This means that whenever a package is using AM_C_PROTOTYPES but doesn't have the ansi2knr.c source file in its distribution, the intended behaviour was probably to call AC_C_PROTOTYPES.
Unfortunately, as with other non-trivial macros, it's possible that projects rely on more than just the main side effect of said macro, which means that if you want to replace its use, you have to also verify that there is no reliance on the call to AC_HEADER_STDC or on the check for the string.h header that is not otherwise executed.
Example 5.1. Safe conversion of a configure.ac relying on AM_C_PROTOTYPES without de-ANSI-fication.
⋮
AC_C_PROTOTYPES
AC_HEADER_STDC
AC_CHECK_HEADERS([string.h])
⋮
Note
Please consider verifying whether the calls to AC_HEADER_STDC and AC_CHECK_HEADERS are actually needed.
It is important to note that, as of writing (July 2012), the AC_C_PROTOTYPES macro is not considered deprecated by autoconf, but it is considered obsolete.
2.2. automake 1.13
Warning
Versions 1.13 and 1.13.1 of automake have been released with regressions in their behaviour, related to the deprecation of the AM_PROG_CC_STDC and AM_CONFIG_HEADER macros. It is thus of the utmost importance not to rely on the behaviour of any version before 1.13.2.
Furthermore, Fedora patched version 1.13.1 to re-introduce the aforementioned macros, which means that the behaviour of any automake version 1.13.1 is distribution-dependent.
Unsupported and changed behaviour in automake starting from version 1.13
- Tests are now executed in parallel, rather than in series. You can return to the old behaviour with the serial-tests option.
- The multi-argument AM_INIT_AUTOMAKE syntax is now considered fully obsolete. This syntax was long discontinued and has never been documented in this guide, but it's still being used by projects using dynamic version definition, due to a bug in autoconf which is still unfixed as of January 2013.
- The cygnus flavour for Automake has been removed. This might require a vast overhaul for the very few projects that still relied on this build flavour.
- A number of internal, deprecated aliases, not starting with the canonical AM_ prefix, have been removed. These were part of the m4/obsolete.m4 macro file and should not really be of importance, as they were mostly used by the projects that implemented them before they were merged into automake.
Together with this, as of the original 1.13 release, two more macros were removed: AM_PROG_CC_STDC (replaced by AC_PROG_CC as provided by autoconf) and AM_CONFIG_HEADER (replaced by AC_CONFIG_HEADERS). Version 1.13.1 re-introduced the macros, but only to throw a proper error message; version 1.13.2 finally re-introduced them, as deprecated, to catch up with Fedora's patched 1.13.1 version.
2.3. automake 1.14
Unsupported and changed behaviour in automake starting from version 1.14
- The AM_PROG_CC_C_O macro has been deprecated, and its use is now a no-op. Instead, the compile script is now required for all builds, and the tests executed by this macro have been bolted onto the basic AC_PROG_CC macro instead. Removal of this macro is scheduled for version 2.0.
- The dist-shar and dist-tarZ options have been deprecated and are scheduled to be removed for version 2.0.
2.4. automake 2.0
Starting with version 1.13, the automake versioning scheme has changed so that backward-incompatible changes are only introduced in major versions. The next major version of automake is version 2.0, due out in 2013.
Warning
This content is still in flux as this version of automake is not out yet. It'll be updated to the best of my abilities over time until release.
Unsupported and changed behaviour in automake starting from version 2.0
- Projects still using the old configure.in name for the autoconf script are no longer supported. Simply rename the file to the modern name of configure.ac.
3. Changes in libtool releases
3.1. libtool 2.2
The 2.2 series of libtool is a major overhaul and partial rewrite over the previous, widely adopted 1.5 series. Featuring an only partially compatible libltdl, this version introduced new macro names for initialisation and configuration, as well as new, slimmed down macro files, and smarter checks.
The number of problems related to porting to libtool 2.2 is quite high and some do require builder attention.
3.1.1. The C++ and Fortran support
Before libtool 2.2, as soon as a project started using the support provided by this package, checks for C++ and Fortran compilers were added. These checks, lengthy and pointless for C-only projects, often caused grief, to the point that many different hacks, using M4 directives, were used and suggested in different guides.
With the new release series, this problem has been solved: LT_INIT now only checks for the compilers actually declared as needed by the project itself. While this works for most projects, there are a few where this caused further problems, and further grief.
The problem appears evident when building packages written in C++ (Fortran is mostly the same) that don't check for the proper compiler: the automake execution will start reporting problems that "am__fastdepCXX does not appear in AM_CONDITIONAL":
% automake --add-missing --copy --foreign
/usr/share/automake-1.10/am/depend2.am: am__fastdepCXX does not appear in AM_CONDITIONAL
/usr/share/automake-1.10/am/depend2.am: The usual way to define `am__fastdepCXX' is to add `AC_PROG_CXX'
/usr/share/automake-1.10/am/depend2.am: to `configure.ac' and run `aclocal' and `autoconf' again.
cjeca32/Makefile.am: C++ source seen but `CXX' is undefined
cjeca32/Makefile.am: The usual way to define `CXX' is to add `AC_PROG_CXX'
cjeca32/Makefile.am: to `configure.ac' and run `autoconf' again.
The main problem here is that the error (divided in two parts) is actually meaningful only in its second part for most people, since the first three lines sound like gibberish to the common user.
The second part of the error actually tells you exactly what to do: add AC_PROG_CXX to configure.ac; even better, before the initialisation call.
Example 5.2. Properly Fix missing C++ support with libtool 2.2
dnl configure.ac
AC_INIT
AM_INIT_AUTOMAKE
dnl add this
AC_PROG_CXX
AC_PROG_LIBTOOL
Please note that the macro AC_PROG_LIBTOOL in this snippet is the deprecated, 1.5-compatible form of LT_INIT. Since the error shown above happens during porting to libtool 2.2, it's unlikely that the new form is found.
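For comparison, once the porting is complete, the modern equivalent of the snippet above would be (a sketch; options to LT_INIT omitted):
dnl configure.ac
AC_INIT
AM_INIT_AUTOMAKE
dnl check for the C++ compiler before initialising libtool
AC_PROG_CXX
LT_INIT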
Glossary
Some of the terms used throughout the guide are not in common usage, but are almost always used to refer to simple concepts. The choice of more technical, less common names is often rooted in the need to be as unambiguous as possible. For this reason I'm trying to avoid using, for instance, the term linker, which can easily refer to two quite different, but related, pieces of software. To resolve doubts regarding these terms, you can thus refer to the following glossary.
Acronyms and shorthands
ABI
See Application Binary Interface.
API
See Application Programming Interface.
ELF
See Executable and Linkable Format.
ld
See Link Editor.
ld.so
See Runtime Loader.
PE
See Portable Executable.
Concepts
Application Binary Interface
With ABI we refer to the interface of a library, as exposed to its consumers (other libraries or executables alike), after the build step. In the case of software written in languages similar to C, the ABI comprises at least the names of the functions, the type and order of their parameters, the type and size of the data it exports, and the content, order and size of its structures.
See Also Application Programming Interface.
Application Programming Interface
The generic term API refers to all the interfaces (or points of contact) between one application and another. When referring to libraries, the term takes the specific meaning of the set of functions, constants and variables defined in the library's public headers, to be further transformed into an ABI during the build step.
See Also Application Binary Interface.
Dynamic Linker
See Runtime Loader.
Executable and Linkable Format
The file format used in Linux, Solaris, FreeBSD and other operating systems to represent executable files (programs), shared objects as well as intermediate object files. The format is designed to be extensible, so that not all modern features are supported on each operating system.
See Also Portable Executable.
Link Editor
The application used to transform any number of translation units into an object of a different nature. Loosely referred to as linker, or by the common shorthand ld used on Unix systems to execute it. Not to be confused with the dynamic linker, which in this context is referred to as runtime loader.
See Also Runtime Loader.
Linker
See Link Editor.
Portable Executable
The file format used by Microsoft Windows to represent executable files (programs) as well as shared objects. While for a long time it was almost impossible to find PE files for architectures different from i386, the format was originally designed to be usable on a wide range of CPU architectures, as Windows NT originally supported MIPS, PowerPC and Alpha as well as i386; nowadays non-i386 PE files are commonly used on Windows CE embedded systems as well as alternative Windows-based platforms.
See Also Executable and Linkable Format.
Runtime Loader
The software component, usually invoked by the kernel, that reads, loads in memory, and sets up a dynamic executable file and its dependencies. Loosely referred to as dynamic linker, runtime linker or dynamic loader or by its shorthand ld.so as it is called on Unix systems. Not to be confused with the linker, which in this context is referred to as link editor.
Runtime Linker
See Runtime Loader.
Index
Symbols
$build (variable) (see CBUILD (variable))
$host (variable) (see CHOST (variable))
$target (variable) (see CTARGET (variable))
A
aclocal, Using AC_CONFIG_MACRO_DIR (and aclocal)
AC_ARG_ENABLE (macro), AC_ARG_ENABLE and AC_ARG_WITH
AC_ARG_VAR (macro), Environment Variables as Arguments
AC_ARG_WITH (macro), AC_ARG_ENABLE and AC_ARG_WITH
AC_CACHE_VAL (macro), Caching Results
AC_CANONICAL_BUILD (macro), Canonical Systems
AC_CANONICAL_HOST (macro), Canonical Systems
AC_CANONICAL_TARGET (macro), Canonical Systems
AC_CHECK_HEADER, autoconf 2.64
AC_CHECK_HEADER (macro), Checking For Headers
AC_CHECK_HEADERS (macro), Checking For Headers
AC_CHECK_HEADERS_ONCE (macro), Once-expanded checks
AC_COMPILE_IFELSE (macro), "Build Tests", autoconf 2.68
AC_CONFIG_MACRO_DIR (macro), Using AC_CONFIG_MACRO_DIR (and aclocal)
AC_CONFIG_MACRO_DIRS (macro), Using AC_CONFIG_MACRO_DIR (and aclocal)
AC_C_PROTOTYPES (macro), automake 1.12
AC_DEFUN_ONCE (macro), Once-Expansion
AC_FUNC_ERROR_AT_LINE, autoconf 2.68
AC_FUNC_LSTAT_FOLLOWS_SLASHED_SYMLINK, autoconf 2.68
AC_FUNC_MKTIME, autoconf 2.68
AC_FUNC_STRTOD, autoconf 2.68
AC_LANG_CALL (macro, deprecated), Tests' Input Sources
AC_LANG_DEFINES_PROVIDED (macro), Tests' Input Sources, autoconf 2.68
AC_LANG_FUNC_LINK_TRY (macro, deprecated), Tests' Input Sources
AC_LANG_PROGRAM (macro), Tests' Input Sources, autoconf 2.68
AC_LANG_SOURCE (macro), Tests' Input Sources, autoconf 2.68
AC_LINK_IFELSE (macro), "Build Tests", autoconf 2.68
AC_PREPROC_IFELSE (macro), "Build Tests", autoconf 2.68
AC_RUN_IFELSE (macro), "Run Tests", autoconf 2.68
AC_SEARCH_LIBS (macro), Searching For System Libraries
AC_TEST_PROGRAM (macro, renamed) (see AC_RUN_IFELSE)
AC_TRY_CPP (macro, obsolete) (see AC_PREPROC_IFELSE)
AC_TRY_RUN (macro, renamed) (see AC_RUN_IFELSE)
AH_CHECK_HEADERS (macro), autoconf 2.64
AM_C_PROTOTYPES (macro, deprecated), automake 1.12
AM_INIT_AUTOMAKE (macro), Available options, automake 1.13
AM_PROG_CC_C_O (macro, deprecated), automake 1.14
AM_PROG_MKDIR_P (macro, deprecated), automake 1.12
AM_SILENT_RULES (macro) (see automake options, silent-rules)
AS_CASE (macro), M4sh
AS_IF (macro), M4sh
autoconf, Configuring The Build — autoconf, Changes in autoconf releases
automake, Harnessing Make — automake, Changes in automake release
automake options, Available options, Automake flavours, Automake flavours, Automake flavours, Automake flavours, Silent Building with Automake
ansi2knr (deprecated), automake 1.12
cygnus, Automake flavours, automake 1.13
dist-lzma (deprecated), automake 1.12
dist-shar (deprecated), automake 1.14
dist-tarZ (deprecated), automake 1.14
foreign, Automake flavours
gnits, Automake flavours
gnu, Automake flavours
parallel-tests, automake 1.13
serial-tests, automake 1.13
silent-rules, Silent Building with Automake
subdir-objects, Achieving Non-Recursive Make
C
CBUILD (variable), Canonical Systems
CHOST (variable), Canonical Systems
common errors
am__fastdepCXX does not appear in AM_CONDITIONAL, The C++ and Fortran support
present but cannot be compiled, Headers Present But Cannot Be Compiled
config.cache (file), Caching Results
configure.in (file), automake 2.0
CTARGET (variable), Canonical Systems
cygnus (automake flavour), Automake flavours, automake 1.13
F
foreign (automake flavour), Automake flavours
G
gnits (automake flavour), Automake flavours
gnu (automake flavour), Automake flavours
GNU autoconf (see autoconf)
GNU automake (see automake)
GNU libtool (see libtool)
L
libltdl, Using libltdl for plug-ins
libtool, Building All Kinds of Libraries — libtool, Changes in libtool releases
libtool options, Exposing and Hiding Symbols, Exposing and Hiding Symbols, -export-dynamic
-export-dynamic, -export-dynamic
-export-symbols, Exposing and Hiding Symbols
-export-symbols-regex, Exposing and Hiding Symbols
-no-install, libtool wrappers
LTDL_INIT (macro), Linking, bundling, installing libtldl
LT_CONFIG_LTDL_DIR (macro), Linking, bundling, installing libtldl
M
M4sh, M4sh
m4_include (macro), With Just autoconf
Makefile.am variables
ACLOCAL_AMFLAGS (deprecated), Using AC_CONFIG_MACRO_DIR (and aclocal)
P
pkg-config, Dependency discovery — pkg-config
PKG_CHECK_MODULES (macro), The PKG_CHECK_MODULES Macro
PKG_CONFIG_LIBDIR (variable), Search paths
PKG_CONFIG_PATH (variable), Search paths
S
silent-rules (see automake options, silent-rules)
Appendix A. Who's afraid of Autotools?
Quick tutorial for autotoolizing a project
While the whole guide is designed toward explaining the mechanisms and the behaviour of autotools, it was never meant, by itself, to be a tutorial. This appendix should cover that base, by providing an easy path to writing an autotools build system for your project.
1. Basic Autotoolization
As already described in the rest of this guide, with the name "Autotools" we refer to a number of different tools. If you have a very simple program (not hellow-simple, but still simple), you definitely want to use at the very least two of them: autoconf and automake. While you could use the former without the latter, you really don't want to. This means that you need two files: configure.ac and Makefile.am.
The first of the two files is processed to produce a configure script which the user will be executing at build time. It is also the bane of most people because, if you look at one for a complex project, you'll see lots of content (and logic) and next to no comments on what things do.
Lots of it is cargo-culting and I'm afraid I cannot help but just show you a possible basic configure.ac file:
AC_INIT([myproject], [123], [flameeyes@flameeyes.eu], [http://blog.flameeyes.eu/tag/autotoolsmythbuster])
AM_INIT_AUTOMAKE([foreign no-dist-gz dist-xz])
AC_PROG_CC
AC_OUTPUT([Makefile])
The first two lines are used to initialize autoconf and automake respectively. The former is being told the name and version of the project, the place to report bugs, and a URL for the package to use in documentation. The latter is told that we're not a GNU project (seriously, this is important: this way you can avoid creating 0-sized files just because they are mandatory in the default GNU layout, like NEWS), and that we want a .tar.xz tarball and not a .tar.gz one (which is the default). See Section 1, "Available options" for more details.
After initializing the tools, you need to, at the very least, ask for a C compiler. You could have asked for a C++ compiler as well, but I'll leave that as an exercise to the reader. Finally, you have to tell it to output Makefile (it'll use Makefile.in, but we'll create Makefile.am instead soon).
To build a program, you need then to create a Makefile.am similar to this:
bin_PROGRAMS = hellow
dist_doc_DATA = README
Here we're telling automake that we have a program called hellow (whose sources are, by default, hellow.c) which has to be installed in the binary directory, and a README file that has to be distributed in the tarball and installed as a piece of documentation. Yes, this is really enough as a very basic Makefile.am.
If you were to have two programs, hellow and hellou, sharing a convenience library between the two, you could do it this way:
bin_PROGRAMS = hellow hellou
hellow_SOURCES = src/hellow.c
hellow_LDADD = libhello.a
hellou_SOURCES = src/hellou.c
hellou_LDADD = libhello.a
noinst_LIBRARIES = libhello.a
libhello_a_SOURCES = lib/libhello.c lib/libhello.h
dist_doc_DATA = README
But then you'd have to add AC_PROG_RANLIB to the configure.ac calls, as shown in the sketch below. My suggestion, though, is that if you want to link things statically and it's just one or two files, just go for building them twice… it can actually make the build faster (one less serialisation step), and with the new LTO options it may very well improve the optimisation as well.
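If you do keep the convenience archive, the configure.ac from before only gains the one extra check; a sketch:
AC_INIT([myproject], [123], [flameeyes@flameeyes.eu], [http://blog.flameeyes.eu/tag/autotoolsmythbuster])
AM_INIT_AUTOMAKE([foreign no-dist-gz dist-xz])
AC_PROG_CC
AC_PROG_RANLIB
AC_OUTPUT([Makefile])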
2. Adding libtool
Let's start from a fundamental rule: if you're not going to install a library, you don't want to use libtool. Some projects that only ever deal with programs still use it because that way they can rely on .la files for static linking. My suggestion is (very simply) not to rely on them as much as you can. Doing it this way means that you no longer have to care about using libtool for non-library-providing projects.
But if you are building any library, using libtool is important. Even if the library is internal only, trying to build it without libtool is just going to be a big headache for the packager that looks into your project. Before entering the details of how you use this tool, though, let's look into something else: what you need to make sure you think about in your library.
First of all, make sure to have a unique prefix for your public symbols, be they constants, variables or functions. You might also want to have one for symbols that you use within your library across different translation units. My suggestion in this example is going to be that symbols starting with foo_ are public, while symbols starting with foo__ are private to the library. You'll soon see why this is important.
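To make the convention concrete, here is a sketch of how the declarations could look (the function names are made up for illustration):
/* lib/foo1.h: public interface, all symbols carry the foo_ prefix */
int foo_init(void);

/* lib/foo-internal.h: helpers shared between translation units carry foo__ */
int foo__open_device(const char *path);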
Reducing the amount of symbols that you expose is not only good for performance; it also means that you avoid the off-chance of symbol collisions, which are a big problem to debug. So do pay attention. There is another thing that you should consider when building a shared library, and that's the way the library's ABI is versioned, but it's a topic that goes quite deep, so just see Section 4, "Library Versioning" for further details.
Once you've got these details sorted out, you should start by slightly changing the configure.ac file from the previous section so that it initializes libtool as well:
AC_INIT([myproject], [123], [flameeyes@flameeyes.eu], [http://blog.flameeyes.eu/tag/autotoolsmythbuster])
AM_INIT_AUTOMAKE([foreign no-dist-gz dist-xz])
LT_INIT
AC_PROG_CC
AC_OUTPUT([Makefile])
Now it is possible to provide a few options to LT_INIT, for instance to disable the generation of static archives by default. My personal recommendation is not to touch those options in most cases. Packagers will disable static linking when it makes sense, and if the user does not know much about static and dynamic linking, they are better off getting everything by default in a manual install.
On the Makefile.am side, the changes are very simple. Libraries built with libtool have a different class than programs and static archives, so you declare them as lib_LTLIBRARIES with a .la extension (at build time this is unavoidable). The only real difference between _LTLIBRARIES and _PROGRAMS is that the former gets its additional links from _LIBADD rather than _LDADD like the latter.
bin_PROGRAMS = fooutil1 fooutil2 fooutil3
lib_LTLIBRARIES = libfoo.la
libfoo_la_SOURCES = lib/foo1.c lib/foo2.c lib/foo3.c
libfoo_la_LIBADD = -lz
libfoo_la_LDFLAGS = -export-symbols-regex '^foo_[^_]'
fooutil1_LDADD = libfoo.la
fooutil2_LDADD = libfoo.la
fooutil3_LDADD = libfoo.la -ldl
pkginclude_HEADERS = lib/foo1.h lib/foo2.h lib/foo3.h
The _HEADERS variable is used to define which header files to install and where. In this case, it goes into ${prefix}/include/${PACKAGE}, as I declared it a pkginclude install.
The use of -export-symbols-regex (See also Section 3, "Exposing and Hiding Symbols") ensures that only the symbols that we want to have publicly available are exported and does so in an easy way.
Appendix B. License
CreativeCommons Attribution-NonCommercial-ShareAlike 3.0 Unported
THE WORK (AS DEFINED BELOW) IS PROVIDED UNDER THE TERMS OF THIS CREATIVE COMMONS PUBLIC LICENSE ("CCPL" OR "LICENSE"). THE WORK IS PROTECTED BY COPYRIGHT AND/OR OTHER APPLICABLE LAW. ANY USE OF THE WORK OTHER THAN AS AUTHORIZED UNDER THIS LICENSE OR COPYRIGHT LAW IS PROHIBITED.
BY EXERCISING ANY RIGHTS TO THE WORK PROVIDED HERE, YOU ACCEPT AND AGREE TO BE BOUND BY THE TERMS OF THIS LICENSE. TO THE EXTENT THIS LICENSE MAY BE CONSIDERED TO BE A CONTRACT, THE LICENSOR GRANTS YOU THE RIGHTS CONTAINED HERE IN CONSIDERATION OF YOUR ACCEPTANCE OF SUCH TERMS AND CONDITIONS.
1. Definitions
- "Adaptation" means a work based upon the Work, or upon the Work and other pre-existing works, such as a translation, adaptation, derivative work, arrangement of music or other alterations of a literary or artistic work, or phonogram or performance and includes cinematographic adaptations or any other form in which the Work may be recast, transformed, or adapted including in any form recognizably derived from the original, except that a work that constitutes a Collection will not be considered an Adaptation for the purpose of this License. For the avoidance of doubt, where the Work is a musical work, performance or phonogram, the synchronization of the Work in timed-relation with a moving image ("synching") will be considered an Adaptation for the purpose of this License.
- "Collection" means a collection of literary or artistic works, such as encyclopedias and anthologies, or performances, phonograms or broadcasts, or other works or subject matter other than works listed in Section 1(g) below, which, by reason of the selection and arrangement of their contents, constitute intellectual creations, in which the Work is included in its entirety in unmodified form along with one or more other contributions, each constituting separate and independent works in themselves, which together are assembled into a collective whole. A work that constitutes a Collection will not be considered an Adaptation (as defined above) for the purposes of this License.
- "Distribute" means to make available to the public the original and copies of the Work or Adaptation, as appropriate, through sale or other transfer of ownership.
- "License Elements" means the following high-level license attributes as selected by Licensor and indicated in the title of this License: Attribution, Noncommercial, ShareAlike.
- "Licensor" means the individual, individuals, entity or entities that offer(s) the Work under the terms of this License.
- "Original Author" means, in the case of a literary or artistic work, the individual, individuals, entity or entities who created the Work or if no individual or entity can be identified, the publisher; and in addition (i) in the case of a performance the actors, singers, musicians, dancers, and other persons who act, sing, deliver, declaim, play in, interpret or otherwise perform literary or artistic works or expressions of folklore; (ii) in the case of a phonogram the producer being the person or legal entity who first fixes the sounds of a performance or other sounds; and, (iii) in the case of broadcasts, the organization that transmits the broadcast.
- "Work" means the literary and/or artistic work offered under the terms of this License including without limitation any production in the literary, scientific and artistic domain, whatever may be the mode or form of its expression including digital form, such as a book, pamphlet and other writing; a lecture, address, sermon or other work of the same nature; a dramatic or dramatico-musical work; a choreographic work or entertainment in dumb show; a musical composition with or without words; a cinematographic work to which are assimilated works expressed by a process analogous to cinematography; a work of drawing, painting, architecture, sculpture, engraving or lithography; a photographic work to which are assimilated works expressed by a process analogous to photography; a work of applied art; an illustration, map, plan, sketch or three-dimensional work relative to geography, topography, architecture or science; a performance; a broadcast; a phonogram; a compilation of data to the extent it is protected as a copyrightable work; or a work performed by a variety or circus performer to the extent it is not otherwise considered a literary or artistic work.
- "You" means an individual or entity exercising rights under this License who has not previously violated the terms of this License with respect to the Work, or who has received express permission from the Licensor to exercise rights under this License despite a previous violation.
- "Publicly Perform" means to perform public recitations of the Work and to communicate to the public those public recitations, by any means or process, including by wire or wireless means or public digital performances; to make available to the public Works in such a way that members of the public may access these Works from a place and at a place individually chosen by them; to perform the Work to the public by any means or process and the communication to the public of the performances of the Work, including by public digital performance; to broadcast and rebroadcast the Work by any means including signs, sounds or images.
- "Reproduce" means to make copies of the Work by any means including without limitation by sound or visual recordings and the right of fixation and reproducing fixations of the Work, including storage of a protected performance or phonogram in digital form or other electronic medium.
2. Fair Dealing Rights.
Nothing in this License is intended to reduce, limit, or restrict any uses free from copyright or rights arising from limitations or exceptions that are provided for in connection with the copyright protection under copyright law or other applicable laws.
3. License Grant.
Subject to the terms and conditions of this License, Licensor hereby grants You a worldwide, royalty-free, non-exclusive, perpetual (for the duration of the applicable copyright) license to exercise the rights in the Work as stated below:
- to Reproduce the Work, to incorporate the Work into one or more Collections, and to Reproduce the Work as incorporated in the Collections;
- to create and Reproduce Adaptations provided that any such Adaptation, including any translation in any medium, takes reasonable steps to clearly label, demarcate or otherwise identify that changes were made to the original Work. For example, a translation could be marked "The original work was translated from English to Spanish," or a modification could indicate "The original work has been modified.";
- to Distribute and Publicly Perform the Work including as incorporated in Collections; and,
- to Distribute and Publicly Perform Adaptations.
The above rights may be exercised in all media and formats whether now known or hereafter devised. The above rights include the right to make such modifications as are technically necessary to exercise the rights in other media and formats. Subject to Section 8(f), all rights not expressly granted by Licensor are hereby reserved, including but not limited to the rights described in Section 4(e).
4. Restrictions.
The license granted in Section 3 above is expressly made subject to and limited by the following restrictions:
- You may Distribute or Publicly Perform the Work only under the terms of this License. You must include a copy of, or the Uniform Resource Identifier (URI) for, this License with every copy of the Work You Distribute or Publicly Perform. You may not offer or impose any terms on the Work that restrict the terms of this License or the ability of the recipient of the Work to exercise the rights granted to that recipient under the terms of the License. You may not sublicense the Work. You must keep intact all notices that refer to this License and to the disclaimer of warranties with every copy of the Work You Distribute or Publicly Perform. When You Distribute or Publicly Perform the Work, You may not impose any effective technological measures on the Work that restrict the ability of a recipient of the Work from You to exercise the rights granted to that recipient under the terms of the License. This Section 4(a) applies to the Work as incorporated in a Collection, but this does not require the Collection apart from the Work itself to be made subject to the terms of this License. If You create a Collection, upon notice from any Licensor You must, to the extent practicable, remove from the Collection any credit as required by Section 4(d), as requested. If You create an Adaptation, upon notice from any Licensor You must, to the extent practicable, remove from the Adaptation any credit as required by Section 4(d), as requested.
- You may Distribute or Publicly Perform an Adaptation only under: (i) the terms of this License; (ii) a later version of this License with the same License Elements as this License; (iii) a Creative Commons jurisdiction license (either this or a later license version) that contains the same License Elements as this License (e.g., Attribution-NonCommercial-ShareAlike 3.0 US) ("Applicable License"). You must include a copy of, or the URI, for Applicable License with every copy of each Adaptation You Distribute or Publicly Perform. You may not offer or impose any terms on the Adaptation that restrict the terms of the Applicable License or the ability of the recipient of the Adaptation to exercise the rights granted to that recipient under the terms of the Applicable License. You must keep intact all notices that refer to the Applicable License and to the disclaimer of warranties with every copy of the Work as included in the Adaptation You Distribute or Publicly Perform. When You Distribute or Publicly Perform the Adaptation, You may not impose any effective technological measures on the Adaptation that restrict the ability of a recipient of the Adaptation from You to exercise the rights granted to that recipient under the terms of the Applicable License. This Section 4(b) applies to the Adaptation as incorporated in a Collection, but this does not require the Collection apart from the Adaptation itself to be made subject to the terms of the Applicable License.
- You may not exercise any of the rights granted to You in Section 3 above in any manner that is primarily intended for or directed toward commercial advantage or private monetary compensation. The exchange of the Work for other copyrighted works by means of digital file-sharing or otherwise shall not be considered to be intended for or directed toward commercial advantage or private monetary compensation, provided there is no payment of any monetary compensation in connection with the exchange of copyrighted works.
- If You Distribute, or Publicly Perform the Work or any Adaptations or Collections, You must, unless a request has been made pursuant to Section 4(a), keep intact all copyright notices for the Work and provide, reasonable to the medium or means You are utilizing: (i) the name of the Original Author (or pseudonym, if applicable) if supplied, and/or if the Original Author and/or Licensor designate another party or parties (e.g., a sponsor institute, publishing entity, journal) for attribution ("Attribution Parties") in Licensor's copyright notice, terms of service or by other reasonable means, the name of such party or parties; (ii) the title of the Work if supplied; (iii) to the extent reasonably practicable, the URI, if any, that Licensor specifies to be associated with the Work, unless such URI does not refer to the copyright notice or licensing information for the Work; and, (iv) consistent with Section 3(b), in the case of an Adaptation, a credit identifying the use of the Work in the Adaptation (e.g., "French translation of the Work by Original Author," or "Screenplay based on original Work by Original Author"). The credit required by this Section 4(d) may be implemented in any reasonable manner; provided, however, that in the case of a Adaptation or Collection, at a minimum such credit will appear, if a credit for all contributing authors of the Adaptation or Collection appears, then as part of these credits and in a manner at least as prominent as the credits for the other contributing authors. For the avoidance of doubt, You may only use the credit required by this Section for the purpose of attribution in the manner set out above and, by exercising Your rights under this License, You may not implicitly or explicitly assert or imply any connection with, sponsorship or endorsement by the Original Author, Licensor and/or Attribution Parties, as appropriate, of You or Your use of the Work, without the separate, express prior written permission of the Original Author, Licensor and/or Attribution Parties.
- For the avoidance of doubt:
- Non-waivable Compulsory License Schemes. In those jurisdictions in which the right to collect royalties through any statutory or compulsory licensing scheme cannot be waived, the Licensor reserves the exclusive right to collect such royalties for any exercise by You of the rights granted under this License;
- Waivable Compulsory License Schemes. In those jurisdictions in which the right to collect royalties through any statutory or compulsory licensing scheme can be waived, the Licensor reserves the exclusive right to collect such royalties for any exercise by You of the rights granted under this License if Your exercise of such rights is for a purpose or use which is otherwise than noncommercial as permitted under Section 4(c) and otherwise waives the right to collect royalties through any statutory or compulsory licensing scheme; and,
- Voluntary License Schemes. The Licensor reserves the right to collect royalties, whether individually or, in the event that the Licensor is a member of a collecting society that administers voluntary licensing schemes, via that society, from any exercise by You of the rights granted under this License that is for a purpose or use which is otherwise than noncommercial as permitted under Section 4(c).
- Except as otherwise agreed in writing by the Licensor or as may be otherwise permitted by applicable law, if You Reproduce, Distribute or Publicly Perform the Work either by itself or as part of any Adaptations or Collections, You must not distort, mutilate, modify or take other derogatory action in relation to the Work which would be prejudicial to the Original Author's honor or reputation. Licensor agrees that in those jurisdictions (e.g. Japan), in which any exercise of the right granted in Section 3(b) of this License (the right to make Adaptations) would be deemed to be a distortion, mutilation, modification or other derogatory action prejudicial to the Original Author's honor and reputation, the Licensor will waive or not assert, as appropriate, this Section, to the fullest extent permitted by the applicable national law, to enable You to reasonably exercise Your right under Section 3(b) of this License (right to make Adaptations) but not otherwise.
5. Representations, Warranties and Disclaimer
UNLESS OTHERWISE MUTUALLY AGREED TO BY THE PARTIES IN WRITING AND TO THE FULLEST EXTENT PERMITTED BY APPLICABLE LAW, LICENSOR OFFERS THE WORK AS-IS AND MAKES NO REPRESENTATIONS OR WARRANTIES OF ANY KIND CONCERNING THE WORK, EXPRESS, IMPLIED, STATUTORY OR OTHERWISE, INCLUDING, WITHOUT LIMITATION, WARRANTIES OF TITLE, MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, NONINFRINGEMENT, OR THE ABSENCE OF LATENT OR OTHER DEFECTS, ACCURACY, OR THE PRESENCE OF ABSENCE OF ERRORS, WHETHER OR NOT DISCOVERABLE. SOME JURISDICTIONS DO NOT ALLOW THE EXCLUSION OF IMPLIED WARRANTIES, SO THIS EXCLUSION MAY NOT APPLY TO YOU.
6. Limitation on Liability
EXCEPT TO THE EXTENT REQUIRED BY APPLICABLE LAW, IN NO EVENT WILL LICENSOR BE LIABLE TO YOU ON ANY LEGAL THEORY FOR ANY SPECIAL, INCIDENTAL, CONSEQUENTIAL, PUNITIVE OR EXEMPLARY DAMAGES ARISING OUT OF THIS LICENSE OR THE USE OF THE WORK, EVEN IF LICENSOR HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
7. Termination
- This License and the rights granted hereunder will terminate automatically upon any breach by You of the terms of this License. Individuals or entities who have received Adaptations or Collections from You under this License, however, will not have their licenses terminated provided such individuals or entities remain in full compliance with those licenses. Sections 1, 2, 5, 6, 7, and 8 will survive any termination of this License.
- Subject to the above terms and conditions, the license granted here is perpetual (for the duration of the applicable copyright in the Work). Notwithstanding the above, Licensor reserves the right to release the Work under different license terms or to stop distributing the Work at any time; provided, however that any such election will not serve to withdraw this License (or any other license that has been, or is required to be, granted under the terms of this License), and this License will continue in full force and effect unless terminated as stated above.
8. Miscellaneous
- Each time You Distribute or Publicly Perform the Work or a Collection, the Licensor offers to the recipient a license to the Work on the same terms and conditions as the license granted to You under this License.
- Each time You Distribute or Publicly Perform an Adaptation, Licensor offers to the recipient a license to the original Work on the same terms and conditions as the license granted to You under this License.
- If any provision of this License is invalid or unenforceable under applicable law, it shall not affect the validity or enforceability of the remainder of the terms of this License, and without further action by the parties to this agreement, such provision shall be reformed to the minimum extent necessary to make such provision valid and enforceable.
- No term or provision of this License shall be deemed waived and no breach consented to unless such waiver or consent shall be in writing and signed by the party to be charged with such waiver or consent.
- This License constitutes the entire agreement between the parties with respect to the Work licensed here. There are no understandings, agreements or representations with respect to the Work not specified here. Licensor shall not be bound by any additional provisions that may appear in any communication from You. This License may not be modified without the mutual written agreement of the Licensor and You.
- The rights granted under, and the subject matter referenced, in this License were drafted utilizing the terminology of the Berne Convention for the Protection of Literary and Artistic Works (as amended on September 28, 1979), the Rome Convention of 1961, the WIPO Copyright Treaty of 1996, the WIPO Performances and Phonograms Treaty of 1996 and the Universal Copyright Convention (as revised on July 24, 1971). These rights and subject matter take effect in the relevant jurisdiction in which the License terms are sought to be enforced according to the corresponding provisions of the implementation of those treaty provisions in the applicable national law. If the standard suite of rights granted under applicable copyright law includes additional rights not granted under this License, such additional rights are deemed to be included in the License; this License is not intended to restrict the license of any rights under applicable law.
Appendix C. Further Readings
While the goal of this guide is to provide comprehensive documentation on autotools and the tools related to them, it will take a long time before it can be considered self-sufficient.
Until then, I'd like to point out some further readings that provide pointers and reference material useful to those of you interested in working with autotools.
Some of this material (most of it, at the time of writing) is content I have written for various other venues, and it is already available online. Where compatible with the goal of this guide and with the license of the published content, it will be merged into this guide in due time.
[AutoconfManual] GNU autoconf manual.
[AutomakeManual] GNU automake manual.
[LibtoolManual] GNU libtool manual.
[CalcoteAutotools] Autotools: A Practical Guide to GNU Autoconf, Automake, and Libtool. John Calcote. No Starch Press. July 2010.
[AutotoolsBestPractices] Best Practices With Autotools. Diego E. "Flameeyes" Pettenò. August 2005.
[GentooAutomagic] Automagic dependencies, what they are and how to fix them. Diego E. "Flameeyes" Pettenò et al.
[GentooAsNeeded] --as-needed introduction and fixing guide. Diego E. "Flameeyes" Pettenò et al.
[MillerRecursiveMake] Recursive Make Considered Harmful. Peter Miller. 1997. http://miller.emu.id.au/pmiller/books/rmch/
Appendix D. Examples
Table of Contents
1. From Section 7.1, "Why Caching is Not What You're Looking For"
Example D.1. configure.c.ac
AC_INIT
dnl A plain C compiler check: whatever language standard the compiler
dnl defaults to is used for all subsequent tests.
AC_PROG_CC
AC_CHECK_HEADERS([thisonlyworksonc99.h])
AC_OUTPUT
Example D.2. configure.c99.ac
AC_INIT
AC_PROG_CC
dnl Request C99 mode: the header check below now runs with the compiler
dnl in C99 mode, so its result differs from that of Example D.1.
AC_PROG_CC_C99
AC_CHECK_HEADERS([thisonlyworksonc99.h])
AC_OUTPUT
Example D.3. thisonlyworksonc99.h
/* This header compiles only when the compiler is in C99 mode. */
#if __STDC_VERSION__ != 199901L
# error "Sorry this won't work"
#endif
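Running the two configure scripts against a shared cache file shows the problem described in Section 7.1. What follows is a minimal sketch, not part of the original examples: the directory names c89 and c99 are illustrative, each is assumed to hold one of the two files above saved as configure.ac together with thisonlyworksonc99.h, and CPPFLAGS=-I. is only needed so the compiler can find the local header.
#!/bin/sh
# Generate the two configure scripts.
( cd c89 && autoconf )        # from Example D.1
( cd c99 && autoconf )        # from Example D.2
# First run: with AC_PROG_CC alone the header does not compile, and
# the "no" answer is recorded in config.cache (-C enables the cache).
( cd c89 && ./configure -C CPPFLAGS=-I. )
# Re-using that cache for the C99 script skips the compile test
# entirely, so the check still reports "no", even though the header
# would now compile with AC_PROG_CC_C99 in effect.
cp c89/config.cache c99/config.cache
( cd c99 && ./configure -C CPPFLAGS=-I. )
This is exactly why sharing cache files between different projects, or between differently-configured runs of the same project, produces stale and misleading results.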