Eldev (Elisp development tool) is an Emacs-based build tool, targeted solely at Elisp projects. It is an alternative to Cask. Unlike Cask, Eldev itself is fully written in Elisp and its configuration files are also Elisp programs. If you are familiar with Java world, Cask can be seen as a parallel to Maven — it uses project description, while Eldev is sort of a parallel to Gradle — its configuration is a program on its own.
Brief overview
Eldev features:
- Eldev configuration is Elisp. It can change many defaults, add special cases for Emacs versions and much more — even define additional Eldev commands and options.
- Built-in support for regression/unit testing.
- Blends nicely into continuous integration setups.
- Can run on different Emacs versions even on the same machine; can also use Docker or Podman for that.
- There are four levels of configuration — you can customize most aspects of Eldev for your project needs and personal preferences.
- Project dependency downloading, installation etc. is fully automated; you only need to specify which Elisp package archive(s) to use.
- You can also use local dependencies, even those that don’t use Eldev (some restrictions still apply). This is similar to Cask linking, but with more flexibility.
- Full support for autoloads during development.
- Miscellaneous operations useful during development: running Emacs with only your project, linting source code, evaluating expressions in the project’s context, profiling.
- Can automate the release process for your project.
- Eldev by default isolates your project for development, helping you distinguish between problems with setup or configuration and inherent bugs in the project.
- Full-featured build system for complex projects.
- Runs on all major operating systems: Linux, macOS, Windows.
- Eldev is fast.
Drawbacks:
- Eldev doesn’t run the project being tested/built in a separate process, so it is not as pure as Cask. However, Emacs packages won’t live in a sterile world anyway: a typical user setup will include dozens of other packages.
- Eldev depends much more on Emacs internals. It is more likely to break with future Emacs versions than Cask.
Eldev is not widely used, but as of August 2022 there are around 120 projects on GitHub that include a file Eldev, so it is already quite well tested in the wild. Additionally, Eldev contains a fairly large regression test collection.

If you are using Flycheck or Flymake, check out the flycheck-eldev or, correspondingly, flymake-eldev package. They provide integration between Flycheck/Flymake and Eldev, allowing the former to automatically use proper dependencies in Eldev projects.
Example projects
Here is a non-exhaustive list of projects that use Eldev and can serve as examples. I intentionally list only my own projects, even if there are others, because this way it’s easier to ensure that the comments below stay valid. Eldev source code itself comes with no examples: I think real-world usage provides better models.
extmap; its file Eldev; its file .github/workflows/test.yml
    A simple project with no dependencies. As you can see, there is nothing in its Eldev. The file is actually not even needed; it is only there to signify that Eldev can be used on the project and for some tools (flycheck-eldev, flymake-eldev, Projectile).
iter2; its file Eldev; its file .github/workflows/test.yml
    Another simple project with no dependencies. However, it uses its file Eldev to define a custom option that activates project-specific development assistance code. Additionally, it enables the undercover plugin to collect test code coverage statistics.
Logview; its file Eldev; its file .github/workflows/test.yml
    This project has several dependencies, so it needs to instruct Eldev how to find them.
datetime; its file Eldev; its file .github/workflows/test.yml
    A library with a fairly complicated file Eldev. The main reason for the complexity is two included Java programs that are used for 1) extracting information from Java core libraries; and 2) comparing datetime’s results against a Java implementation during testing. It also uses extmap to generate resource files that are later included in its package.
All these projects also use continuous integration on GitHub for automated testing. Various elements of the Eldev files in these projects are documented below.
Requirements
Eldev runs on Emacs 24.4 and up. On earlier Emacs versions it will be overly verbose, but this is rather an Emacs problem.
Any “typical” OS — Linux, macOS, Windows or any POSIX-like system not
listed earlier — will do. Additionally, since there is only a small
shell script (.bat
file for Windows) that is really OS-dependent,
porting to other systems should not be difficult, volunteers welcome.
Eldev intentionally has no dependencies, at least currently: otherwise your project would also see them, which could in theory lead to some problems.
Installation
There are several ways to install Eldev.
Bootstrapping from MELPA: if you have a catch-all directory for executables
- On Linux, macOS, etc.:

  From this directory (e.g. ~/bin) execute:

  $ curl -fsSL https://raw.github.com/emacs-eldev/eldev/master/bin/eldev > eldev && chmod a+x eldev

  You can even do this from /usr/local/bin provided you have the necessary permissions.

- On Windows:

  From this directory (e.g. %USERPROFILE%\bin) execute:

  > curl.exe -fsSL https://raw.github.com/emacs-eldev/eldev/master/bin/eldev.bat > eldev.bat

No further steps necessary — Eldev will bootstrap itself as needed on first invocation.
Bootstrapping from MELPA: general case
- On Linux, macOS, etc.:

  Execute:

  $ curl -fsSL https://raw.github.com/emacs-eldev/eldev/master/webinstall/eldev | sh

  This will install the eldev script to ~/.local/bin. Generally, this directory should already be in your PATH. But if not, add this e.g. to ~/.profile:

  export PATH="$HOME/.local/bin:$PATH"

- On Windows:

  Execute:

  > curl.exe -fsSL https://raw.github.com/emacs-eldev/eldev/master/webinstall/eldev.bat | cmd /Q

  This will install the eldev.bat script to %USERPROFILE%\.local\bin. Add this directory to your PATH:

  > reg add HKCU\Environment /v Path /d "%USERPROFILE%\.local\bin;%PATH%" /f

Afterwards Eldev will bootstrap itself as needed on first invocation.

eldev doesn’t really need to be findable through PATH — it will work regardless. This is rather for your convenience, so that you don’t need to type the full path again and again.
Installing from sources
- Clone the source tree from GitHub.
- In the cloned working directory execute,

  on Linux, macOS, etc.:

  $ ./install.sh DIRECTORY

  on Windows:

  > install.bat DIRECTORY

Here DIRECTORY is the location where the eldev executable should be put. It should be in your PATH environment variable, or else you will need to specify the full path each time you invoke Eldev. You probably have something like ~/bin in your PATH already, which would be a good value for DIRECTORY. You could even install into e.g. /usr/local/bin — but make sure you have the permissions first.
Mostly for developing Eldev itself
- Clone the source tree from GitHub.
- Set environment variable ELDEV_LOCAL to the full path of the working directory.
- Make sure an eldev executable is available: either follow any of the installation methods described above, or symlink/copy the file bin/eldev from the cloned directory to somewhere on your PATH.

Now each time Eldev is executed, it will use the sources at ELDEV_LOCAL. You can even modify them and see how that affects Eldev immediately.
Upgrading Eldev
Eldev bootstraps itself when needed, but won’t automatically fetch new versions. To upgrade it later, explicitly run (from any directory):
$ eldev upgrade-self
By default it uses MELPA Stable. If you want to test or use some not yet officially released version, try:
$ eldev --unstable upgrade-self
This will make it use MELPA Unstable for upgrading. If you want to
switch back to the latest stable version (as recommended), supply -d
(--downgrade
) option to the command:
$ eldev upgrade-self -d
Safety concerns
In general, it is not recommended to execute Eldev, GNU Make, SCons, any other build tool or anything based on one in a directory that contains untrusted code.
Like many (if not most) other development tools, Eldev is unsafe when executed on untrusted code. For example, simply running eldev in a project you have just downloaded from hackerden.org can result in anything, including an emptied home directory. For that matter, running make or gradle is no better in this regard. Eldev is perhaps a bit more dangerous, because even eldev help reads the file Eldev, thus executing arbitrary code.
Even seemingly harmless things, like opening a .el
file in Emacs can
lead to unforeseen consequences. If you e.g. have
Flycheck or Flymake enabled everywhere,
this will result in byte-compiling said file, which also can execute
arbitrary code, for example using (eval-when-compile …)
form. The
same holds for installing (not even using!) Elisp packages.
Only use build tools on code that you trust. Better yet, don’t even touch code that you don’t plan running.
Getting started
Eldev comes with built-in help. Just run:
$ eldev help
This will list all the commands Eldev supports. To see detailed description of any of those, type:
$ eldev help COMMAND
In the help you can also see lots of options — both global and specific to certain commands. Many common things are possible just out of the box, but later we will discuss how to define additional commands and options or change defaults for the existing ones.
The two most important global options to remember are --trace (-t) and --debug (-d). With the first one, Eldev prints lots of additional information about what it is doing to stdout. With the second, Eldev prints stacktraces for most errors. These options will often help you figure out what’s going wrong without requesting any external assistance. Also check out the section on various debugging features discussed later.
Eldev mostly follows GNU conventions in its command line. Perhaps the only exception is that global options must be specified before command name and command-specific options — after it.
Initializing a project
When Eldev starts up, it configures itself for the project in the directory where it is run from. This is done by loading the Elisp file called Eldev (without extension!) in the current directory. This file is similar to Make’s Makefile or Cask’s Cask, but even more so to Gradle’s build.gradle, because it is a program. The file Eldev is not strictly required, but nearly all projects will have one. It is also generally recommended to create it even if empty, because otherwise some tools (e.g. flycheck-eldev, flymake-eldev, Projectile) will not recognize the project as Eldev-based.
You can create the file in your project manually, but it is easier to just let Eldev itself do it for you, especially the first time:
$ eldev init
If you let the initializer do its work, it will create file Eldev
already prepared to download project dependencies. If you answer “no”
to its question (or execute as eldev init --non-interactive
), just
edit the created file and uncomment some of the calls to
eldev-use-package-archive
there as appropriate. These forms
instruct Eldev to use specific package archives to download project
dependencies.
After this step, Eldev is ready to work with your project.
Initializing a checkout
If your project uses Git hooks, you can take
advantage of Eldev’s command githooks
to install them:
$ eldev githooks
Installed as symbolic link: ‘.git/hooks/pre-commit’ -> ‘githooks/pre-commit’
Eldev doesn’t try to enforce any hooks on you; instead, it only symlinks the files the project has in directory githooks to the place Git expects them. For security reasons, Git doesn’t activate any hooks by default upon repository cloning; this has to be an explicit decision — be it copying by hand or an Eldev invocation. If you don’t have such a subdirectory, the command will not do anything and will also not be shown in the output of eldev help.
Setup procedure in details
Now that we have created file Eldev
, it makes sense to go over the
full startup process:
- Load file ~/.config/eldev/config
- Load file Eldev in the current directory
- Load file Eldev-local in the current directory
- Execute setup forms specified on the command line
None of these Elisp files and forms are required. They are also not restricted in what they do. However, their intended usage is different.
File ~/.config/eldev/config
is user-specific. It is meant mostly
for customizing Eldev to your personal preferences. For example, if
you hate coloring of Eldev output, add form (setf eldev-coloring-mode
nil)
to it. Then every Eldev process started for any project will
default to using uncolored output.
More precisely, if directory ~/.eldev exists (for pre-0.11 Eldev installations), file ~/.eldev/config is used. Otherwise, if environment variable XDG_CONFIG_HOME is set, file $XDG_CONFIG_HOME/eldev/config is used. And finally, ~/.config/eldev/config is the default fallback.
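For instance, a minimal sketch of such a personal configuration file (both settings are merely examples of variables discussed elsewhere in this document):

;; ~/.config/eldev/config: personal defaults applied to every project.
(setf eldev-coloring-mode nil)            ; never color Eldev output
(setf eldev-test-print-backtraces nil)    ; don’t print backtraces for failing tests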
File Eldev
is project-specific. It is the only configuration file
that should be added to project’s VCS (Git, Mercurial, etc.). Typical
usage of this file is to define in which package archives to look up
dependencies. It is also the place to define project-specific
builders and commands, for example to build project documentation from
source.
File Eldev-local is working-directory- or user/project-specific. Unlike Eldev, it should not be added to VCS: it is meant to be created by each developer (should they want to do so) to customize how Eldev behaves in this specific directory. The most common use is to define local dependencies. A good practice is to instruct your VCS to ignore this file, e.g. list it in .gitignore for Git.
Finally, it is possible to specify some (short) setup forms on the
command line using --setup
(-S
) option. This is not supposed to
be used often, mostly in cases where you run Eldev on a use-once
project checkout, e.g. on a continuous
integration server.
Project isolation
Eldev tries to create a self-contained environment for building and testing your project. It will isolate your project as much as possible from your “normal” Emacs, i.e. the one that you use for editing. This is done to avoid interference from your other installed packages or configuration, to prevent broken and misbehaving projects from affecting your Emacs and, finally, to simplify testing of certain “permanent effect” features, like customizing variables.
- Packages installed in your Emacs (usually in ~/.emacs.d/elpa/) are not visible to projects built with Eldev. Likewise, dependencies installed for such projects will not appear in your normal Emacs.
- Variable user-emacs-directory will point somewhere inside .eldev in the project’s directory rather than to ~/.emacs.d. This also means that locate-user-emacs-file will not find files in your normal configuration directory. If you want to undo this change (e.g. in file Eldev or Eldev-local), use the original value of the variable stored as eldev-real-user-emacs-directory; see the sketch after this list.
- Eldev supports executing on different Emacs versions for the same project without any additional steps.
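A minimal sketch of undoing the redirection, e.g. in Eldev-local:

;; Point user-emacs-directory back at the real configuration directory.
(setf user-emacs-directory eldev-real-user-emacs-directory)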
Using preinstalled dependencies
Starting with version 0.8 you can opt out of some of the
default project isolation features and use preinstalled dependencies,
e.g. those from your normal Emacs. To activate this mode, use global
option --external
(-X
), e.g.:
$ eldev -X test
In this mode Eldev will expect dependencies to be installed in a given directory (the standard Emacs location — ~/.emacs.d/elpa — is only the default: you can use another directory). If a dependency is not installed, Eldev will not install it on its own: it doesn’t know which package archives should be used. Likewise, it will not upgrade anything. In all such cases, i.e. when required dependencies are not correctly preinstalled in the specified external directory, Eldev will simply fail.
Local dependencies discussed later take precedence even in this mode: anything declared as local will override dependencies available from an external directory, just like it will in usual full isolation mode.
This mode can be useful to load exactly the same dependency versions
as those installed in your normal Emacs. However, it is not suitable
for continuous integration or for working on packages that you do not
have — for whatever reason — installed normally. It is also difficult
to test on different Emacs versions in
external directory mode. Therefore, it is not the default. But, as
usual in Eldev, you can make it the default in file
~/.config/eldev/config
if you want.
There is also a way to disable dependency management completely. However, other than in a few very special cases, you should prefer normal operation.
Project dependencies
Eldev picks up project dependencies from the package declaration, i.e. usually from the Package-Requires header in the project’s main .el file. If you have several files with package headers in the root directory, you need to set variable eldev-project-main-file, otherwise function package-dir-info can pick the wrong one. In any case, you don’t need to declare these dependencies a second time in Eldev and keep track that they remain in sync.
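As a sketch, pinning the main file in Eldev could look like this (the filename is hypothetical and presumably given relative to the project root):

;; Tell Eldev which file carries the package headers.
(setf eldev-project-main-file "my-package.el")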
However, you do need to tell Eldev how to find these dependencies.
Like Cask, by default it doesn’t use any package archives. To tell it
to use an archive, call function eldev-use-package-archive
in
Eldev
(you have such forms already in place if you have used eldev
init
). For example:
(eldev-use-package-archive 'melpa)
Eldev knows about three “standard” archives, which should cover most
of your needs: gnu-elpa
, nongnu-elpa
and melpa
.
You can also explicitly choose stable (a.k.a. release: gnu
, nongnu
and melpa-stable
) or unstable (a.k.a. snapshot or development:
gnu-devel
, nongnu-devel
and melpa-unstable
) variants instead.
There are some naming inconsistencies here, because for GNU and NonGNU only the stable variants had been added at first (nongnu only in 0.10); the unstable (development) variants were added later, in 1.1.
A better way is provided by two global options: --stable (the default) and --unstable. Normally, Eldev will try to install everything from the stable archive variants (you wouldn’t want your tests to fail only because a dependency has a bug in an unstable version). However, if a package is not available (at all or in the required version) from the stable archive, the unstable one will be used automatically. If you specify --unstable on the command line, Eldev will behave in the opposite way: prefer the unstable archive and use the stable one only as a fallback.
Emacs 25 and up supports package archive priorities. Eldev backports this to Emacs 24 and utilizes the feature to assign the standard archives it knows about priorities 300/190 (GNU ELPA, stable/unstable variants), 250/150 (NonGNU ELPA), and 200/100 (MELPA). A dependency from an archive with a lower priority is installed only if there are no other options.
If dependencies for your project are only available from some other archive, you can still use the same function. Just substitute the symbolic archive name with a cons cell of name and URL as strings:
(eldev-use-package-archive '("myarchive" . "http://my.archive.com/packages/"))
You don’t need to perform any additional steps to have Eldev actually
install the dependencies: any command that needs them will make sure
they are installed first. However, if you want to check if package
archives have been specified correctly and all dependencies can be
looked up without problems, you can explicitly use command prepare
.
Local dependencies
Imagine you are developing more than one project at once and they depend on each other. You’d typically want to test the changes you make in one of them from another right away. If you are familiar with Cask, this is solved by linking projects in it.
Eldev provides a more flexible approach to this problem called local
dependencies. Let’s assume you develop project foo
in directory
~/foo
and also a library called barlib
in ~/barlib
. And foo
uses the library. To have Eldev use your local copy of barlib
instead of downloading it e.g. from MELPA, add the following form in
file ~/foo/Eldev-local
:
(eldev-use-local-dependency "~/barlib")
Note that the form must not be added to Eldev
: other developers
who check out your project probably don’t even have a local copy of
barlib
or maybe have it in some other place. In other words, this
should really remain your own private setting and go to Eldev-local
.
Local dependencies have loading modes, just as the project’s package itself. Those will be discussed later.
Eldev correctly handles situations with changing definitions of local
dependencies. I.e. by simply commenting out or uncommenting
eldev-use-local-dependency
call, you can quickly test your project
both with a MELPA-provided package and with a local dependency — Eldev
will adapt without any additional work from you.
Additional dependencies
It is possible to register additional dependencies for use only by certain Eldev commands. Perhaps the most useful application is making certain packages available for testing purposes. For example, if your project doesn’t depend on package foo on its own, but your test files do, add the following form to the file Eldev:
(eldev-add-extra-dependencies 'test 'foo)
Additional dependencies are looked up in the same way as normal ones. So, you need to make sure that all of them are available from the package archives you instructed Eldev to use.
The following commands make use of additional dependencies: build
,
emacs
, eval
, exec
and test
. Commands you define yourself can
also take advantage of this mechanism, see function
eldev-load-project-dependencies
.
Extended dependency format
Normally, to specify an additional dependency you just need to provide its package name as a symbol. However, Eldev also supports an “extended” format that lets you specify other details. In this format, a dependency is specified as a property list (plist):
(:package DEPENDENCY-NAME
:version REQUIRED-VERSION
:archive PACKAGE-ARCHIVE
:archives (PACKAGE-ARCHIVE...)
:optional OPTIONAL)
All keywords except :package
can be omitted. In the extended format
you can specify which version of the dependency is required (normally,
any version will do) and which package archive(s) to use (by default,
the same archives as for normal dependencies are used). In values
associated with :archive
/:archives
standard shortcuts gnu
(for
GNU ELPA) and melpa
(for MELPA; also melpa-stable
and
melpa-unstable
) can be used. Dependencies can also be marked as
optional, see the next subsection.
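A sketch of combining the extended format with eldev-add-extra-dependencies (the package name and version here are made up for illustration):

;; Require hypothetical package ‘some-helper’, version 1.2 or newer,
;; looked up in MELPA Stable, for the ‘test’ command only.
(eldev-add-extra-dependencies 'test '(:package some-helper :version "1.2" :archive melpa-stable))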
There is also a special format for referring to certain
tools like Buttercup: (:tool TOOL-NAME)
. For details,
refer to section Development tool
sources.
Optional additional dependencies
Suppose you want to test your project’s integration with a
third-party package, but don’t strictly need it. And, additionally,
relevant tests are written in such a way as to simply be skipped if
said package is not available, e.g. using ert-skip
or
buttercup-skip
. In this case you may want to declare the package as
an optional additional dependency, so that you don’t need to care if
it can be installed during continuous integration or not:
(eldev-add-extra-dependencies 'test '(:package helm :optional t))
In this example, we declare that we want Helm for testing, but don’t care much if it cannot be installed, e.g. because of too old Emacs version.
Examining dependencies
Sometimes it is useful to check what a project depends on, especially if it is not your project, just something you have checked out. There are two commands for this in Eldev.
The first command is dependencies (can be shortened to deps). It lists the direct dependencies of the project being built. By default, it omits any built-in packages, most importantly emacs. If you want to check those too, add option -b (--list-built-ins).

The second command is dependency-tree (short alias: dtree). It prints a tree of the project’s direct dependencies, the direct dependencies of those, and so on — recursively. Like with the first command, use option -b if you want to see built-ins in the tree.
Both commands can also list additional dependencies if instructed: just specify set name(s) on the command line, e.g.:
$ eldev dependencies test
You can also check which archives Eldev uses to look up dependencies for this particular project with the following command:
$ eldev archives
Upgrading dependencies
Eldev will install project dependencies automatically, but it will never upgrade them, at least if you don’t change your project to require a newer version. However, you can always explicitly ask Eldev to upgrade the installed dependencies:
$ eldev upgrade
First, package archive contents will be refetched, so that Eldev knows about newly available versions. Next, this command upgrades (or installs, if necessary) all project dependencies and all additional dependencies you might have registered (see above). If you don’t want to upgrade everything, you can explicitly list names of the packages that should be upgraded:
$ eldev upgrade dash ht
You can also check what Eldev would upgrade without actually upgrading anything:
$ eldev upgrade --dry-run
If you use MELPA for looking up dependencies, you can switch between Stable and Unstable using global options with the same name, i.e.:
$ eldev --unstable upgrade
Because of the incompatible version numbers that MELPA Unstable
supplies, you cannot directly “upgrade” from an unstable version back
to a stable one. But you can specify option -d
(--downgrade
) to
the command:
$ eldev --stable upgrade -d
In this case Eldev will downgrade dependencies if this allows it to
use more preferable package archive. (Since --stable
is the
default, specifying it in the command above is not really needed, it’s
only mentioned for clarity.)
To install unstable version of only a specific dependency, while
leaving all others at stable versions, combine --unstable
with
listing package names after the command, e.g.:
$ eldev --unstable upgrade dash
Upgrading development tools
Command upgrade
works not only with package
dependencies, but also with common development tools used by the
project during development, for example Buttercup or
various linters. This works exactly the same as for
project dependencies, with the only exception that the tool must be
installed first. E.g., for Buttercup you need to test
your project at least once, so that Eldev knows about the need for
this tool.
Development tools are installed from package archives hardcoded inside
Eldev (but see the next section),
regardless of which archives you have configured for your project.
For example, even if you use melpa-unstable
archive, Buttercup will
still be installed from MELPA Stable (unless, of course, you use
--unstable
global option). If you need, you can switch to unstable
version of the tool later:
$ eldev --unstable upgrade buttercup
Development tool sources
Eldev knows how to install certain development tools and also uses predefined package archives for this, not the ones you specify in project’s configuration. This means you don’t need to list archives for tools like Buttercup: only list them if they are needed to look up real dependencies.
There is a simple way to customize where exactly Eldev
finds the tools: use variable eldev-known-tool-packages
for this.
The value of the variable is an alist keyed by tool names and
containing package descriptor plists as
values. By default it already contains information about the tools
Eldev knows about. You can add more or replace existing ones if you
need: just push
more entries at the beginning of the list, there is
no need to actually remove anything.
You can also use the tools as e.g. runtime dependencies if needed (though in most cases you should leave this to Eldev). Just specify the package plist as (:tool TOOL-NAME) for this. Both tools with built-in support and any new ones you add to eldev-known-tool-packages can be referred to this way.
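For example, a sketch of making Buttercup available to code run via the emacs command, using the tool format described above:

;; Register the Buttercup tool as an extra dependency of command ‘emacs’.
(eldev-add-extra-dependencies 'emacs '(:tool buttercup))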
Current list of the known tools:
- buttercup
- ecukes
- package-lint
- relint
- elisp-lint
- undercover
Disabling dependency management
This operation mode is not recommended. It exists only to support special use cases that “insist” on setting Emacs’ load-path directly and cannot be customized (with reasonable effort).
Eldev has limited support for operating without
dependency management. This mode can be activated using global
option --disable-dependencies
(there is no short version to
emphasize that it is not recommended). It exists to support certain
environments that themselves provide a suitable value for Emacs
variable load-path
via environment variable EMACSLOADPATH
. An
example of such an environment is GUIX package building.
Unlike in preinstalled-dependency mode
described earlier, here Eldev doesn’t work with dependencies (and
tools) as standard Emacs packages at all.
Instead, everything is expected to be loadable using require
form
without any further setup. For this, variable load-path
must be set
appropriately, most likely using EMACSLOADPATH
(though you could, in
principle, set its value in e.g. Eldev-local
).
One consequence of this mode is that Emacs package system won’t
consider dependency packages installed at all, see
package-installed-p
. Autoloads are not supported in
this mode (neither for dependencies nor for the project itself), so
you must explicitly require all features before using them. There
might be other, unexpected, limitations as well, as this mode is not
thoroughly tested.
Unless you need to build packages for GUIX or have some comparable
environment that computes load-path
on its own and doesn’t let Eldev
manage dependencies normally by accessing standard package archives,
you shouldn’t use this mode.
Global package archive cache
To avoid downloading the same packages repeatedly, Eldev
employs a package archive cache. This cache is shared between all
projects and all Emacs versions on your
machine. It can significantly speed up package preparation if you use
a new project, test it on another Emacs version or delete
project-specific cache (subdirectory .eldev
) for whatever reason.
By default, downloaded packages stay cached indefinitely, while
archive contents expires in one hour. However, if you use command
upgrade
or upgrade-self
, package archive contents is always
refreshed.
Cache usage is not controllable from command line. However, you can
customize it somewhat in ~/.config/eldev/config
. Variable
eldev-enable-global-package-archive-cache
lets you disable the
global cache outright. Using
eldev-global-cache-archive-contents-max-age
you can adjust how long
cached copies of archive-contents
stay valid.
Loading modes
In Eldev the project’s package and its local dependencies have loading modes. This affects exactly how the package (that of the project or of its local dependency) becomes loadable by Emacs.
The default loading mode is called as-is. It means the directory where the project (or local dependency) is located is simply added to the Emacs variable load-path, and normal Emacs loading should be able to find required features from there on. This is the fastest mode, since it requires no preparation, and in most cases it is basically what you want during development.
However, users won’t have your project loaded like that. To emulate
the way that most of the people will use it, you can use loading mode
packaged
. In this mode, Eldev will first build a package out of
your project (or local dependency), then install and activate it using
Emacs’ packaging system. This is quite a bit slower than as-is
,
because it involves several preparation steps. However, this is
almost exactly the way normal users will use your project after
e.g. installing it from MELPA. For this reason, this mode is
recommended for continuous integration and
other forms of automated testing.
Other modes include byte-compiled
and source
. In these modes
loading is performed just as in as-is
mode, but before that Eldev
either byte-compiles everything or, vice-versa, removes .elc
files.
Loading mode compiled-on-demand is useful primarily for larger projects that include some computation-intensive code which needs to be byte-compiled to run in reasonable time. In this mode, a project file is byte-compiled (if needed) only when it is loaded, e.g. via a require form. In contrast, in mode byte-compiled all Elisp files are byte-compiled before a command (e.g. test) even gets to start working. This makes the mode useful during work on the project “core”, because 1) compilation time is reduced; 2) you can test or otherwise use the core without even updating higher-level files to be successfully compilable first. However, as a drawback, compilation can run “in the middle” of the project’s real code, which can occasionally cause unforeseen troubles, as well as screw up profiling.
Mode noisy-compiled-on-demand
is basically the same,
with the only exception that Eldev prints a message (the same as
during normal compilation) when it decides to recompile something. In
comparison, compiled-on-demand
will only write to stderr and only
if there are compilation warnings or errors. Since having “random”
text inserted in normal program output is potentially disrupting,
especially if said output is processed by another tool, this is not
the default and you have to actively choose between
compiled-on-demand
and noisy-compiled-on-demand
.
When using commands exec
and eval
with
compiled-on-demand
mode, you may want to use option -R
(--dont-require
) and then manually require
only the necessary
features, to reduce the set of files that have to be (re)compiled.
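For example, a hypothetical invocation (the feature and function names are made up) that only compiles what is actually needed:

$ eldev -o exec -R "(require 'foo-core)" "(foo-core-compute)"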
So, after discussing various loading modes, let’s have a look at how exactly you tell Eldev which one to use.
For the project itself, this is done from the command line using
global option --loading
(or -m
) with its argument being the name
of the mode. Since this is supposed to be used quite frequently,
there are also shortcut options to select specific modes: --as-is
(or -a
), --packaged
(-p
), --source
(-s
), --byte-compiled
(-c
) and --compiled-on-demand
(-o
). For example, the following
command will run unit-tests in the project, having it loaded as an
Emacs package:
$ eldev -p test
Remember that, as with everything in Eldev, this can be customized.
E.g. if you want to run your project byte-compiled by default, add
this to your Eldev-local
:
(setf eldev-project-loading-mode 'byte-compiled)
For local dependencies the mode can be chosen when calling
eldev-use-local-dependency
. For example:
(eldev-use-local-dependency "~/barlib" 'packaged)
As mentioned above, loading mode defaults to as-is
.
There are a few other loading modes useful only for certain projects. You can always ask Eldev for a full list:
$ eldev --list-modes
Indirect build information
When a loading mode requires Eldev to do something in order to prepare the project or its local dependencies for loading, it tries to do so silently, in the sense that only stderr output is normally displayed. The purpose is to prevent secondary and partially unpredictable (more precisely, depending on previous builds) output from interfering with normal output. For example, if you run
$ eldev eval "(some-project-function)"
it might be confusing if the first line of output is instead

ELC      some-file.el

if the loading mode is byte-compiled and some-file.elc doesn’t exist or is out-of-date. In particular, if such output is then parsed using some automated tool, this could lead to unexpected errors.
However, if this is not a concern in your case, you may
want to set variable eldev-display-indirect-build-stdout
to t. This
is especially useful if your project’s loading mode is built
and it
involves some custom non-trivial build steps, like e.g. compilation of
a helper non-Elisp program.
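A minimal sketch of enabling this, e.g. in file Eldev or Eldev-local:

;; Let indirect builds write to stdout as well.
(setf eldev-display-indirect-build-stdout t)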
Unlike with some other settings, the main project ignores values in
local dependencies. Instead, eldev-display-indirect-build-stdout
as
defined in the main project affects both the project itself and all
local dependencies at once: only the main project “knows” if it is
important to avoid indirect build output for it or not.
Project source directory
Usually Elisp projects contain their source files directly in the root directory. For smaller projects with one or a few files this is the most convenient setup. Eldev assumes this as the default and effectively just adds the project directory to load-path when making the project’s features available for loading in Emacs.
However, some bigger projects instead collect the source files in a subdirectory, to avoid having too many entries at the top, which could distract from other root contents, or simply make the root directory view so large that “README” text after it is buried somewhere deep down. One example of such a project is Magit.
It’s easy to configure Eldev to understand such a layout. Simply add
the following to project’s file Eldev
:
(setf eldev-project-source-dirs "lisp")
As the name of the variable implies, you can also have several subdirectories if you want. For example, for project resources:
(setf eldev-project-source-dirs '("lisp" "resources"))
Directory names are not fixed and can be anything. Another option
could be src
, for example.
Autoloads
Autoloaded functions of installed Elisp packages can be
accessed without a require
form. To simplify development, Eldev
provides the same functionality for projects regardless of loading
mode, as long as file PACKAGE-autoloads.el
exists. This might look
like an unwieldy requirement, but luckily there is
a plugin for building the file and keeping it
up-to-date as necessary. The reason this is not enabled by default is
that many projects — especially those not providing user-visible
functionality, or those that consist of a single file — don’t have any
autoloading functions or other forms.
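Assuming the plugin is activated in the standard way, enabling it should amount to a single form in file Eldev (a sketch):

;; Enable the plugin that builds and refreshes PACKAGE-autoloads.el.
(eldev-use-plugin 'autoloads)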
Local dependencies also have their autoloads activated regardless of loading mode. If the autoloads file is kept up-to-date using the plugin, Eldev will take care to do this as needed in local dependencies too.
Build system
Eldev comes with quite a sophisticated build system. While by default it only knows how to build packages, byte-compile .el files and make .info from .texi, you can extend it with custom builders that can do anything you want. For example, generate resource files that should be included in the final package.
The main command is predictably called build
. There are also
several related commands which will be discussed in the next sections.
Targets
The build system is based on targets. Targets come in two kinds: real and virtual. Targets of the first kind correspond to files — not necessarily already existing. When needed, such targets get rebuilt and the files are (re)generated in the process. Targets of the second kind always have names that begin with “:” (like keywords in Elisp). The most important virtual target is called :default — this is what Eldev will build if you don’t request anything explicitly.
To find all targets in a project (more precisely, its main
target set):
$ eldev targets
Project’s targets form a tree. Before a higher-level target can be built, all its children must be up-to-date, i.e. built first if necessary. In the tree you can also see sources for some targets. Those can be distinguished by lack of builder name in brackets. Additionally, if output is colored, targets have special color, while sources use default text color.
Here is how the target tree looks for the Eldev project itself (the version may be different and more targets may be added in the future):
:default
    bin/eldev  [SUBST]
        bin/eldev.in
:package
    dist/eldev-0.1.tar  [PACK]
        bin/eldev  [repeated, see above]
        eldev-ert.el
        eldev-util.el
        eldev.el
:compile
    eldev-ert.elc  [ELC]
        eldev-ert.el
    eldev-util.elc  [ELC]
        eldev-util.el
    eldev.elc  [ELC]
        eldev.el
:package-archive-entry
    dist/eldev-0.1.entry  [repeated, see ‘dist/eldev-0.1.tar’ above]
And a short explanation of various elements:
:default, :package, :compile etc.
    Virtual targets. The ones you see above are typical, but there could be more.
bin/eldev, dist/eldev-0.1.tar, eldev-ert.elc etc.
    Real targets.
SUBST, PACK, ELC
    Builders used to generate targets. Note that virtual targets never have builders. SUBST is not a standard builder; it is defined in file Eldev of the project.
bin/eldev.in, eldev-ert.el etc.
    Sources for generating targets. Certain targets have more than one source file. Also note how targets can have other targets as their sources (bin/eldev is both a target on its own and a source for dist/eldev-0.1.tar).
[repeated ...]
    To avoid exponential increase in tree size, Eldev doesn’t repeat target subtrees. Instead, only the root target of a subtree is printed.
Target cross-dependencies
FIXME
Target sets
Eldev groups all targets into sets. Normally, there are only two
sets called main
and test
, but you can define more if you need
(see variable eldev-filesets
). For example, if your project
includes a development tool that certainly shouldn’t be included in
project’s package, it makes sense to break it out into a separate
target set.
Target sets should be seen only as ways of grouping targets together
for the purpose of quickly enumerating them. Two targets in the same
set can be completely independent from each other. Similarly, targets
from different sets can depend on each other (provided this doesn’t
create a circular dependency, of course). For example, targets in set
test
will often depend on those in set main
, because test .el
files usually require
some features from main
.
By default, command build
operates only on main
target set. You
can use option --set
(-s
) to process a different target set. If
you want to build several sets at once, repeat the option as many
times as needed. Finally, you can use special name all
to order
Eldev to operate on all defined sets at once.
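For example, a couple of illustrative invocations based on the options just described:

$ eldev build --set test
$ eldev build -s main -s test
$ eldev build --set all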
Command targets, rather than using this option, expects set names as its arguments. For example:
$ eldev targets test
Building packages
To build an Elisp package out of your project, use command package
:
$ eldev package
This command is basically a wrapper over the build system: it tells the system to generate the virtual target :package. However, there are a few options that can only be passed to this special command, not to the underlying build.
Files that are copied to the package .tar default to Elisp files plus standard documentation: *.info, doc/*.info and dir files alongside those. However, if you need to package additional files, just modify variable eldev-files-to-package. Its value must be a fileset. In particular, you can extend the default value, making a composite fileset, instead of replacing it. Here is an example from Eldev’s own file Eldev:
(setf eldev-files-to-package
      `(:or ,eldev-files-to-package
            ("./bin" "!./bin/**/*.in" "!./bin/**/*.part")))
Here we instruct the package builder to copy all files from subdirectory bin, except those with names ending in .in or .part, in addition to the standard files. If you distribute your package through MELPA, you will unfortunately need to repeat these instructions in the MELPA recipe.
A previously recommended way of modifying this variable is now considered wrong, because it relied on an assumption about the variable’s value that no longer holds.
Normally, packages are generated in subdirectory dist
(more
precisely, in directory specified by eldev-dist-dir
variable). If
needed, you can override this using --output-dir
option.
By default, Eldev will use package’s self-reported version, i.e. value
of “Version” header in its main .el
file. If you need to give the
package a different version, use option --force-version
. E.g. MELPA
would do this if it used Eldev.
Finally, if you are invoking Eldev from a different tool, you might be interested in option --print-filename. When it is specified, Eldev will print the absolute filename of the generated package and the word “generated” or “up-to-date” as the last two lines of its (stdout) output. Otherwise it is a bit tricky to find the package, especially if you don’t use the --force-version option. As an optimisation, you can also reuse the previous package file if Eldev says “up-to-date”.
Byte-compiling
You can use Eldev to byte-compile your project. Indirectly, this can
be done by selecting appropriate loading mode for
the project or its local dependencies. However, sometimes you might
want to do this explicitly. For this, use command compile
:
$ eldev compile
You can also byte-compile specific files:
$ eldev compile foo-util.el foo-misc.el
Eldev will not recompile .el files that have up-to-date .elc versions. So, if you issue command compile twice in a row, it will say “Nothing to do” the second time.
However, simple comparison of modification time of .el
and its
.elc
file is not always enough. Suppose file foo-misc.el
has form
(require 'foo-util)
. If you edit foo-util.el
, byte-compiled file
foo-misc.elc
might no longer be correct, because it has been
compiled against old definitions from foo-util.el
. Luckily, Eldev
knows how to detect when a file require
s another. You can see
this in the target tree:
$ eldev targets --dependencies
[...]
:compile
    foo-misc.elc  [ELC]
        foo-misc.el
        [inh] foo-util.elc
[...]
As a result, if you now edit foo-util.el
and issue compile
again,
both foo-util.elc
and foo-misc.elc
will be rebuilt.
Eldev treats warnings from Emacs’ byte-compiler just as that —
warnings, i.e. they will be shown, but will not prevent compilation
from generally succeeding. However, during
automated testing you might want to check
that there are no warnings. The easiest way to do it is to use
--warnings-as-errors
option (-W
):
$ eldev compile --warnings-as-errors
Command compile
is actually only a wrapper over the generic building
system. You can rewrite all the examples above using command build
.
If you don’t specify files to compile, virtual target :compile
is
built. This target depends on all .elc
files in the project.
However, there is a subtle difference: for compile
you specify
source files, while build
expects targets. Therefore, example
$ eldev compile foo-util.el foo-misc.el
above is equivalent to this command:
$ eldev build foo-util.elc foo-misc.elc
with .el
in filenames substituted with .elc
.
Byte-compiling complicated macros
Certain files with macros in Elisp cannot be byte-compiled without
evaluating them first or carefully applying eval-and-compile
to
functions used in macroexpansions. Because Emacs packaging system
always loads (evaluates) package files before byte-compiling them
during installation, this is often overlooked.
Unlike the packaging system, Eldev by default expects that .el
files
can be compiled without loading them first, i.e. it expects that
eval-and-compile
is applied where needed. This is the default
because it is much faster on certain files.
However, if your project cannot be byte-compiled without loading first
and you don’t want to “fix” this, you can ask Eldev to behave like the
packaging system using --load-before-compiling
(-l
) option:
$ eldev compile -l
Projects that can only be compiled with this setting should specify it
as the default in their file Eldev
:
(setf eldev-build-load-before-byte-compiling t)
You can find more information in section “Evaluation During Compilation” of Elisp manual.
Speed of byte-compilation
While not particularly important in most cases, speed of byte-compilation can become an issue in large projects, especially if they use lots of macros. Eldev tries to speed up byte-compilation by compiling the files in “correct” order.
This means that if, as above, foo-misc.el
require
s feature
foo-util
, then foo-util.el
will always be byte-compiled first, so
that compilation of foo-misc.el
can use faster, byte-compiled
versions of definitions from that file. This works even if Eldev
doesn’t yet know which files require
which.
When Eldev has to change the planned order of byte-compilation because
of a require
form, it writes an appropriate message (you need to run
with option -v
or -t
to see it):
$ eldev -v compile
[...]
ELC      foo-misc.el
Byte-compiling file ‘foo-misc.el’...
ELC      foo-util.el
Byte-compiling file ‘foo-util.el’ early as ‘require’d from another file...
Done building “sources” for virtual target ‘:compile’
Cleaning
While cleaning is not really part of the build system, it is closely related. Cleaning allows you to remove various generated files that are the result of other commands (not only build). The command can be executed without any arguments:
$ eldev clean
In this case, it removes byte-compiled Elisp files and any .info
files generated from .texi
/.texinfo
if you have those in your
project.
In the general case, you can specify the names of one or more cleaners as command arguments. All supported cleaners can be found using option --list-cleaners (-L). Here is a short list of some of the more useful ones:
.eldev
    Delete Eldev’s cache, i.e. subdirectory .eldev for this project.
distribution (or dist)
    Delete the dist subdirectory; useful after building the project’s package.
test-results (or tests)
    Forget previous test results, for ERT.
global-cache
    Remove contents of the global package archive cache. This can be executed from any directory.
all (or everything)
    Run all available cleaners. Some cross-project data may still be retained (currently, only the global package archive cache); that can be cleaned only by explicitly mentioning it.
Cleaners executed by default are called .elc, .info and info-dir. Normally, they delete their targets in all target sets at once. However, you can limit them to the main, test, etc. set with option -s (--set); e.g. the command:
$ eldev clean -s test
would delete all byte-compiled test files.
You can also specify option -n
(--dry-run
) to see what would be
deleted, without actually deleting it.
Testing
Eldev has built-in support for running regression/unit tests of your project. Currently, Eldev supports ERT, Buttercup, Doctest and Ecukes testing frameworks. Leave a feature request in the issue tracker if you are interested in a different library.
Simply executing
$ eldev test
will run all your tests. By default, all tests are expected to be in
files named test.el
, tests.el
, *-test.el
, *-tests.el
or in
test
or tests
subdirectories of the project root. But you can
always change the value of eldev-test-fileset
variable in the
project’s Eldev
as appropriate.
By default, the command runs all available tests. However, during development you often need to run one or a few tests only — when you hunt a specific bug, for example. Eldev provides two ways to select which tests to run.
First is by using a selector (framework-specific, this example is for ERT):
$ eldev test foo-test-15
will run only the test with that specific name. It is of course possible to select more than one test by specifying multiple selectors: they are combined with ‘or’ operation. You can use any selector supported by the testing framework here, see its documentation.
The second way is to avoid loading (and executing) certain test files
altogether. This can be achieved with --file
(-f
) option:
$ eldev test -f foo.el
will execute tests only in file foo.el
and not in e.g. bar.el
.
You don’t need to specify directory (e.g. test/foo.el
); for reasons
why, see explanation of Eldev filesets below.
Both ways of selecting tests can be used together. In this case they are combined with ‘and’ operation: only tests that match selector and which are defined in a loaded file are run.
When a test is failing, a backtrace of the failure is printed. You
can affect its readability and completeness using options -b
(--print-backtrace
, the default) and -B
(--omit-backtraces
).
The first option accepts your screen width as an optional parameter;
backtrace lines get cut to the specified width. (Since 0.7 this can
also be specified as a global option that additionally affects all
other backtraces that are printed by Eldev.) Special value of 0 (the
default in Eldev) disables truncation of backtrace lines. Second
option, -B
, is surprisingly useful. In many cases backtraces don’t
actually give any useful information, especially when the tests
contain only a single assertion, and only clutter the output. If you
have different preferences compared to Eldev, you can customize
variable eldev-test-print-backtraces
in file
~/.config/eldev/config
.
How exactly tests are executed depends on the test runner. If you dislike the default behavior of Eldev, you can choose a different test runner using the --runner (-r) option of the test command; see the list of available test runners with their descriptions using the --list-runners option. Currently they are:
standard
    Invokes the test framework trying not to change anything. Some tweaks might still be activated as needed to make the options to command test work.
simple
    Like standard, with a few Eldev-specific tweaks that I feel are useful (see --list-runners output for details). This is the default runner.
concise
    Like simple, but progress output for ERT, Buttercup and Doctest is replaced by a single dot per passing test. Give it a try to decide if you like that or not.
If you always use a different test runner, it is a good idea to set it
as the default in file ~/.config/eldev/config
. Finally, you can
even write your own runner.
Frameworks
As stated above, Eldev supports ERT (Emacs built-in),
Buttercup, Doctest and
Ecukes testing frameworks. With the exception of
Doctest, you don’t need to specify which framework the project uses,
as the tool can autodetect that. But in some cases (for Doctest —
always) you may need to set variable eldev-test-framework
to either
'ert
, 'buttercup
, 'doctest
or 'ecukes
, as appropriate. It is
also possible to use more than one framework in a project,
see below. You don’t need to declare testing
package(s) as extra dependencies: Eldev
will install them itself when needed.
Eldev tries to provide uniform command line interface to the supported frameworks, but of course there are many differences between them.
ERT
ERT is the “default” testing framework and also an Emacs built-in. This means that no additional packages need to be installed and the framework is available on all non-ancient Emacs versions (at least all Eldev itself supports).
All functionality of test
command works with ERT.
Buttercup
Buttercup is a behavior-driven
development framework for testing Emacs Lisp code. Its support in
Eldev has some limitations. On the other hand, certain functionality
is not supported by the library itself, and e.g. its bin/buttercup
script also doesn’t provide similar features.
When using Buttercup, selectors are patterns from the library’s documentation. I.e. they are regular expressions in Emacs syntax, and only tests with names matching at least one of the specified selectors/patterns are executed.
Things that won’t work with Buttercup at the moment:
- option --stop-on-unexpected (-s);
- specifying screen width with option --print-backtraces (-b): it will always work as if 80 was specified.
Unlike ERT, Buttercup also has no special selectors that are based on the previous run’s results.
Doctest
Doctest is a framework that allows you to embed test expressions into source code, e.g. function documentation. As a result, your unit tests turn into documentation that your users can read.
Because the tests are located directly in the source code rather than
in separate files, there is no robust way to autodetect their
presence. For this reason, you must always declare
that you use this framework using variable eldev-test-framework
.
You can also freely mix it with other tests by employing
multiple test frameworks in the same project.
Support for Doctest in Eldev is quite limited and many things won’t work currently:
- the framework has no notion comparable to selectors, so those will be ignored;
- Doctest doesn’t print failure backtraces, so the relevant options have no effect;
- option --stop-on-unexpected is not supported.
Ecukes
Ecukes is a framework for integration testing. Its support in Eldev is limited to “script” (a.k.a “batch”) mode: neither “no win” nor “win” mode is supported currently.
Instead of adding more command-line options, Eldev reuses its standard selector concept for all of Ecukes’ patterns, anti-patterns, tags and its “run only failing scenarios” option. What a selector means depends on its contents:
- A plain selector is a pattern: scenarios with names matching the regular expression are executed.
- A selector starting with ~ is an antipattern: scenarios matching the regular expression are omitted.
- A selector starting with @ selects scenarios with the given tag.
- A selector starting with ~@ omits scenarios with the tag.
- Finally, there is a selector for running only the scenarios that failed in the previous run.
For example, command
$ eldev test @foo ~open
runs all scenarios tagged as @foo whose names don’t contain the word open.
Unlike the standard (Cask-based) Ecukes test runner, Eldev prints
backtraces of failures in scenario steps by default. As for all other
supported frameworks, however, this can be disabled using option -B
(--omit-backtraces
). If your project uses only Ecukes tests and you
don’t like the backtraces being printed by default, you can always add
(setf eldev-test-print-backtraces nil)
to file Eldev
.
Option -X
(--expect
) is currently not supported for this
framework.
Multiple frameworks in one project
Eldev supports using tests of different types in one
project, in any combination of supported frameworks.
In fact, its autodetection will work even in such cases. However,
especially when using different test types, it might be useful to set
variable eldev-test-framework
to a list of the frameworks the
project uses. E.g.:
(setf eldev-test-framework '(ert buttercup))
The order of elements in the list is important, as this will be the
order in which test
command calls the different frameworks.
Command test
will apply all its options and selectors to all
frameworks (autodetected or specified explicitly as above) at once.
Additionally, when tests of different types are invoked, the command
will print a short summary over all types.
Often, however, you don’t want to mix different test types and instead
run them using separate commands. This is especially useful when you
specify selectors, because those are often different across
frameworks. In this case you can use commands test-FRAMEWORK
or
their shorter aliases FRAMEWORK
. The syntax and behavior of these
commands is the same as that of test
, with the only difference that
only the one specified framework is used. These commands are available
in all projects. However, they are not “advertised”, i.e. not shown in
output of eldev help
, unless you set variable eldev-test-framework
to a list of at least two elements.
Example usage:
$ eldev test-ert
$ eldev ecukes basics.feature
It is also possible to specify filesets that limit test file selection
for each framework, using variables eldev-test-FRAMEWORK-fileset
.
If you often use single-framework commands, these filesets can speed
up testing by not loading unneeded files. For example, if you have
ERT tests in one file called ert.el
and a lot of files with
Buttercup tests, you could add this to file Eldev
:
(setf eldev-test-ert-fileset       "ert.el"
      eldev-test-buttercup-fileset "!ert.el")
Loading test files
There appear to be two common ways of using tests: 1)
they are loaded from the project root; 2) subdirectory test/
(or
similar) in the project is added to load-path
. Eldev supports both.
The first one is the default, since it doesn’t require anything in
addition.
To better understand the second way, imagine your project structure is like this:
tests/
    test-helper.el
    test-my-project.el
and file test-my-project.el
includes a form (require
'test-helper)
. Naturally, this setup will work only if subdirectory
tests/
is in load-path
by the point tests are executed. To
instruct Eldev that your project needs this, add the following to file
Eldev
:
(eldev-add-loading-roots 'test "tests")
where 'test
is the command name and "tests"
is the name of the
subdirectory that should serve as additional loading root. In
principle, loading roots can be used for other commands too, just
like extra dependencies.
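For instance (purely as a hypothetical sketch), if the same subdirectory should also be on load-path when evaluating expressions with the eval command, you could add another loading root for that command:
(eldev-add-loading-roots 'eval "tests")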
If you want to switch to the first way and avoid special forms in file
Eldev
, replace (require 'test-helper)
with (require
'tests/test-helper)
.
Reusing previous test results
ERT provides a few selectors that operate on tests’ last results. Even though different Eldev executions will run in different Emacs processes, you can still use these selectors: Eldev stores and then loads last results of test execution as needed.
For example, execute all tests until some fails (-s
is a shortcut
for --stop-on-unexpected
):
$ eldev test -s
If any fails, you might want to fix it and rerun the tests, to see if the fix helped. The easiest way is:
$ eldev test :failed
For more information, see documentation on ERT
selectors — other “special” selectors (e.g. :new
or :unexpected
)
also work.
For Ecukes there is a comparable feature, though only for
failing scenarios. Internally it is implemented differently, as it is
built into the framework itself, but from the interface point of view
it works almost exactly the same: specify selector :failed
or
:failing
on the command line:
$ eldev test-ecukes :failed
Testing command line simplifications
When variable eldev-dwim
(“do what I mean”) is non-nil, as by
default, Eldev supports a few simplifications of the command line to
make testing even more streamlined.
-
For all frameworks: any selector that ends in
.el
(.feature
for Ecukes) is instead treated as a file pattern. For example:
$ eldev test foo.el
will work as if you specified
-f
before foo.el
. -
For ERT: any symbol selector that doesn’t match a test name is instead treated as a regular expression (i.e. as a string). For example:
$ eldev test foo
will run all tests with names that contain
foo
. You can achieve the same result with a ‘strict’ command line (see also the ERT selector documentation) like this:
$ eldev test \"foo\"
If you dislike these simplifications, set eldev-dwim
to nil
in
~/.config/eldev/config
.
Linting
It might be useful to ensure that your source code follows certain standards. There are many programs called linters that can help you with this. Several of them are also supported by Eldev and can be executed using the tool.
In its simplest form lint
command will execute all supported linters
and let them loose on your source code in main
target set:
$ eldev lint
You don’t need to install anything additionally: Eldev will download and use required packages itself. Because of this, first linting in a project might take a while to prepare, but later the downloaded linters will be reused.
Currently, Eldev knows and uses the following linters:
-
Emacs built-in
checkdoc
. Verifies documentation strings of your functions, variables and so on for various style errors. -
package-lint
, which detects erroneous package metadata, missing dependencies and much more. -
relint
that detects errors in regular expression strings in your source code.
elisp-lint
that checks Elisp code for various errors — it is even more versatile thanpackage-lint
and actually optionally includes it.
In the future, more linters may gain special treatment from Eldev (you
can also leave a feature request in the issue tracker). The full list
can always be found using command eldev lint --list
.
Running all the linters at once is not always what you want. In such a case you can specify the name (or several names) of the linters you want on the command line:
$ eldev lint doc
Names can be simplified by dropping the words “check” and “lint”. It is also possible to explicitly direct linters at certain files, rather than checking everything at once:
$ eldev lint re -f foo.el
Like with testing, you can omit -f
(--file
) option above as long as variable eldev-dwim
is non-nil.
Some projects, however, may decide to follow the advice of certain
linters, but not others. You can explicitly tell Eldev about the
project’s policy by adjusting one or more of the variables
eldev-lint-default
, eldev-lint-default-excluded
and
eldev-lint-disabled
in file Eldev
. All of these variables affect
exactly which linters Eldev runs when their names are not specified
explicitly.
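For example, a project that has decided to ignore relint findings could record that in its file Eldev. This is a sketch, assuming these variables accept the short linter names shown above (check their in-Emacs documentation to be sure):
(setf eldev-lint-disabled '(re))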
Command lint
sets Eldev’s exit status to non-zero if there is at
least one warning from any requested linter. This simplifies using
linting in continuous integration should
you want to do that.
The doctor
Eldev comes with a way to “lint” the project and its checkout too. Just run:
$ eldev doctor
This will check the project for a few common potential problems. All the warnings come with quite verbose explanations, detailing what the doctor thinks is wrong, why, and how to fix it. However, it’s up to you to decide what, if anything at all, should be done — the doctor doesn’t, and actually cannot, fix anything itself.
Here is the current list of available checks, to give an idea of what it verifies:
eldev-presence
-
Checks if file
Eldev
is present. It is possible to run Eldev on projects that don’t have it, but some external tools will then not recognize the project as using Eldev. eldev-byte-compilable
-
Earlier examples would add
no-byte-compile
to fileEldev
, which is now not recommended. E.g. it would prevent flycheck-eldev and flymake-eldev from checking the file. explicit-main-file
-
Some projects have package headers in multiple files, which could confuse the Emacs packaging system. If you have such headers in only one file, this test won’t complain.
explicit-emacs-version
-
A project should always explicitly state which Emacs version it needs. Otherwise users could install it only to find it not working correctly in unpredictable ways.
stable/unstable-archives
-
It is generally recommended to use package archives
gnu-elpa
,nongnu-elpa
ormelpa
instead of hardcoding a stable or unstable variant. You can then switch easily using command-line options.
-
Checks if there is a recent enough stable release of the project. Otherwise users who prefer stable releases will never see that your project has new features. However, this test will never complain if you don’t have any stable releases at all (i.e. if you follow “rolling release” model).
githooks
-
Checks if project-recommended Git hooks are activated in the working tree. Won’t complain if there are no Git hooks in the project.
up-to-date-copyright
-
Checks if the copyright notices in the project’s files appear up-to-date with the changes (it is a common mistake to forget to update those — and I’ve seen that I’m not the only one prone to it).
eldev-file-owners
-
Targeted at some internal bug related to the use of old Eldev versions and/or Docker. Hopefully you’ll never see it complain.
Quickly evaluating expressions
It is often useful to evaluate Elisp expressions in the context of the
project you develop — and probably using functions from the project.
There are two commands for this in Eldev: eval
and exec
. The only
difference between them is that exec
doesn’t print results to
stdout, i.e. it assumes that the forms you evaluate produce some
detectable side-effects. Because of this similarity, we’ll consider
only eval
here.
The basic usage should be obvious:
$ eldev eval "(+ 1 2)"
Of course, evaluating the (+ 1 2)
form is not terribly useful. Usually
you’ll want to use at least one function or variable from the project.
However, for that you need your project not only to be in load-path
(which Eldev guarantees), but also require
d. Luckily, you don’t
have to repeat (require 'my-package)
all the time on the command
line, as Eldev does this too, so normally you can just run it like
this:
$ eldev eval "(my-package-function)"
What Eldev actually does is require all features listed in variable
eldev-eval-required-features
. If the value of that variable is the symbol
:default
, the value of eldev-default-required-features
is taken
instead. And finally, when the value of the latter is the symbol
:project-name
, only one feature with the same name as that of the
project is required. In 95% of the cases this is exactly what you
need. However, if the main feature of the project has a different
name, you can always change the value of one of the mentioned
variables in file Eldev
.
It can also make sense to change the variable’s value in Eldev-local
if you want certain features to always be available for quick testing.
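For example, to always have an extra helper feature available for quick testing, Eldev-local could contain something like this (the feature names are hypothetical):
(setf eldev-eval-required-features '(my-package my-package-test-helpers))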
If you have larger scripts to evaluate (but still not standard
enough to turn into a custom builder or a command), you can
evaluate expressions stored in a file using --file
(-f
) option:
$ eldev eval -f useful-expressions.el
Note that this is not performed using Elisp file loading. Instead,
the file is parsed as a sequence of Elisp expressions and those are
then evaluated/executed in exactly the same way as if specified on the
command line. In particular, for command eval
results are still
printed after every expression, even if they all come from the same
file. As in several other places, when variable eldev-dwim
(“do
what I mean”) is non-nil, option -f
can be omitted as long as
filename ends in .el
. In other words, you can just invoke:
$ eldev eval useful-expressions.el
for the expected effect.
To simplify command-line usage a bit, commands accept multiple expressions in one parameter, to reduce the number of quotes to type. E.g.:
$ eldev eval "(+ 1 2) (+ 3 4)" 3 7
There are “magic” variables called @
, @1
, @2
and so
on, aimed at simplifying access to the results of previous forms from
subsequent forms. Variable @
always holds the result of previous
form evaluation; it is thus not available in the very first form.
Variables @N
, where N
is a number, hold the result of the Nth form (N
starts with 1, not 0). They can be used both in eval
and exec
,
even if the latter doesn’t print the results.
All forms for evaluation/execution are indexed with the same number sequence, regardless of whether they come from the command line (including multiple forms per parameter) or from a file.
To illustrate:
$ eldev eval "(generate-new-buffer \"abc\") (buffer-name @)" #<buffer abc> "abc"
Or:
$ eldev eval "(+ 1 2) (+ 3 4) (* @1 @2)" 3 7 21
Commands eval
and exec
also have a few options aimed
at code profiling. See relevant section below.
Running Emacs
Sometimes you want to run Emacs with just your project installed and see how it works without any customization. You can achieve this in Eldev easily:
$ eldev emacs
This will spawn a separate Emacs that doesn’t read any initialization
scripts and doesn’t have access to your usual set of installed
packages, but instead has access to the project being built with Eldev
— and its dependencies, of course. Similar as with eval
and exec
commands, features listed in variable eldev-emacs-required-features
are required automatically.
Eldev currently doesn’t support running Emacs in terminal
mode (this might even be impossible in Elisp given Eldev’s design).
If your Emacs is configured --with-x (as is usual), this shouldn’t
be a problem. Otherwise it will only function if you run it in batch
mode.
You can also pass any Emacs options through the command line. For
example, this will visit file foo.bar
, which is useful if your
project is a mode for .bar
files:
$ eldev emacs foo.bar
See emacs --help
for what you can specify on the command line.
When issued as shown above, command emacs
will pass the rest of the
command line to Emacs, but also add a few things on its own. First,
it adds everything from the list eldev-emacs-default-command-line
,
which disables ~/.emacs
loading and similar things. Second, it
transfers variables listed in eldev-emacs-forward-variables
to the
child process (this is done in order to keep
project isolation promises). Third, it adds
--eval
arguments to require the features as described above. And
only after that comes the actual command line you specified.
Occasionally you might not want this behavior. In this case, prepend
--
to the command line — then Eldev will pass everything after it to
the spawned Emacs as-is (with the exception of still transferring
variables listed in eldev-emacs-forward-variables
). Remember that
you will likely need to pass at least -q
(--no-init-file
) option
to Emacs, otherwise it will probably fail on your ~/.emacs
since it
will not see your usual packages. To illustrate:
$ eldev emacs -- -q foo.bar
The whole Eldev infrastructure, including its functions, is
available by default in the launched Emacs too (this can be changed in
eldev-emacs-autorequire-eldev
). This is done largely so that 1)
configuration (files Eldev
, Eldev-local
etc.) takes effect also in
the interactive Emacs; 2) various debugging
functions can be easily used there too.
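If you prefer the launched Emacs not to load Eldev at all, Eldev-local could contain something like the following (a sketch, assuming a nil value disables the autorequiring; check the variable’s documentation):
(setf eldev-emacs-autorequire-eldev nil)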
Profiling
Eldev has limited built-in support for profiling your project. The largest limitation here is Emacs itself: it is difficult to obtain a useful code profile, as the provided functionality is really basic and crude.
Profiling is implemented as a “prefix” command, which takes the “real”
command as an argument. This way, anything can be profiled, though of
course it is really useful only with commands eval
, exec
and
test
, maybe occasionally with compile
and similar.
The basic usage is like this:
$ eldev profile --open eval "(my-project-function)"
Option --open
tells Eldev to send resulting profile(s) to your
normal Emacs and open them there. It provides the quickest way to examine
results of profiling. For it to work, Emacs must run a server, see
function server-start
. Another alternative is to store results into
a file, using --file
option, though opening this in Emacs is
currently (28.1) fairly difficult. At least one of --open
and
--file
(can be abbreviated as -o
and -f
) is required.
Eldev is smart enough to start profiling only immediately before running code from your project, i.e. to avoid profiling project dependency installation (if that is needed) and so on. However, it is apparently not possible to exclude irrelevant stuff from backtraces, so those will, unfortunately, contain quite a lot of entries not related to your project.
Like with Elisp function profiler-start
, you can choose between CPU,
memory and both together using options --cpu
, --mem
and --cpu-mem
(or -c
, -m
and -M
). Default profiling mode is CPU.
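For example, to profile memory usage of the test suite and open the result in your running Emacs:
$ eldev profile --open --mem test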
Useful evaluation options
Profiling is most useful with commands eval
and exec
.
Additionally, those have certain options targeted especially at
profiling, though they could, of course, be used in other cases.
Options --macroexpand
and --compile
(-m
and -c
) make the
commands preprocess their expressions accordingly; by default
expressions are simply evaluated using Elisp interpreter.
Option --repeat
(-n
) lets you evaluate the same expression(s)
multiple times. This is useful if normal evaluation is too fast for a
meaningful profile. If the command eval
is used, only the last
result is printed. However, any side effects (including printing)
will be observable that many times, so keep that in mind!
For example:
$ eldev profile -o eval -c -n 10000 "..."
Executing on different Emacs versions
Since Eldev itself is an Elisp program, the version of Emacs you use can
affect any aspect of execution — even before it gets to running
something out of your project. Therefore, inside its “cache”
directory called .eldev
, the utility creates a subdirectory named
after the Emacs version it is executed on. If it is run with a different
Emacs, it will not use dependencies or previous test results, but
rather install or recompute them from scratch.
Normally, Eldev uses command emacs
that is supposed to be resolvable
through PATH
environment variable. However, you can always tell it
to use a different Emacs version, though in this case, Eldev cannot
install it for you, since this is much more complicated than
installing Elisp packages. You need to make sure that Emacs of
desired version is available on your machine (and can be looked up via
PATH
) as an executable — either by installing via your OS’s package
manager, compiling from sources or maybe using EVM. Once you
have it installed, you can tell Eldev to use it by setting either
ELDEV_EMACS
or just EMACS
in the environment, e.g.:
$ EMACS=emacs25 eldev eval emacs-version
This is especially useful for testing your project with different Emacs versions.
Remember, however, that Eldev cannot separate byte-compiled files
(.elc
) from sources. From documentation of
byte-compile-dest-file-function
:
Note that the assumption that the source and compiled files are found in the same directory is hard-coded in various places in Emacs.
Therefore, if you use byte-compilation and switch Emacs versions, don’t forget to clean the directory.
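In practice this means running the clean command before rebuilding with another Emacs; by default it is supposed to delete byte-compiled files:
$ eldev clean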
Using Docker or Podman
Alternatively, if you are on a Linux or macOS system and have Docker or (since 1.10) Podman installed, you can run arbitrary Eldev commands within containers based on the images distributed by docker-emacs. For example:
$ eldev docker 27.2 emacs --eval '(insert (format "Emacs version: %s" emacs-version))'
will start an Emacs 27.2 container and run eldev emacs --eval
'(insert (format "Emacs version: %s" emacs-version))'
in it.
Podman, from Eldev’s point of view, works just the same. There is a special command for it, but it is basically indistinguishable (except for the name):
$ eldev podman 27.2 emacs --eval '(insert (format "Emacs version: %s" emacs-version))'
You may have to run xhost +local:root which allows the Docker
(Podman) container to make connections to the host X server. However,
this does come with some security considerations, see man xhost .
This command can be used not only to start Emacs of given version, but to run any Eldev command. For example, run project’s tests on an older editor version:
$ eldev docker 25.3 test
or evaluate something using project’s functions:
$ eldev docker 26.3 eval "(my-project-read-data \"foo.bin\")"
Docker’s (Podman’s) output is forwarded to normal Eldev output; however, because of Elisp limitations, it all ends up on Eldev’s stdout! There might also be unwieldy delays, so that output doesn’t come smoothly as it is generated by the process inside Docker, but instead arrives in larger chunks. Before Eldev 1.2 the output would only appear once Docker had exited.
It is also possible to use a custom image. For this, replace Emacs version argument (26.3 in the last example above) with the full image name. The image must contain a preinstalled Emacs of a version supported by Eldev (i.e. 24.4 and up), but not Eldev itself.
Additionally, docker run
arguments are customisable via the variable
eldev-docker-run-extra-args
(and likewise for Podman:
eldev-podman-run-extra-args
). For example, adding the following to
your project’s Eldev
:
(setf eldev-docker-run-extra-args '("--name" "my_cool_container"))
will set the container name to “my_cool_container”.
Continuous integration
Because of Eldev’s trivial installation and built-in support for testing, it is a suitable tool for use on continuous integration servers. But of course this only applies if the test framework your project uses is already supported (currently ERT, Buttercup, Doctest and Ecukes).
Eldev will even try to make your CI runs more reliable.
GitHub workflows
The easiest option for continuous integration for GitHub-hosted projects is GitHub workflows, as this doesn’t involve using a 3rd-party service. Probably most Elisp projects can take advantage of this, since GitHub appears to be the most popular hosting for Elisp projects. Workflow definition files for GitHub are somewhat more verbose than for Travis CI, but ultimately not really more complicated.
The easiest way to install an Emacs binary of the appropriate version is to
use jcs090218/setup-emacs
action (which
internally uses nix-emacs-ci). There are other
setup-emacs
actions around, but this one works across all operating
systems. Since EVM seems tuned to Ubuntu Trusty (i.e. what
Travis CI provides), it is likely unsuitable for GitHub workflows.
There is a simple action called setup-eldev
too.
It works on all GitHub-supported operating systems — Linux, macOS and
Windows.
A basic workflow file (you can e.g. name it
.github/workflows/test.yml
) would look something like this:
name: CI

on:
  push:
    paths-ignore:
      - '**.md'
  pull_request:
    paths-ignore:
      - '**.md'

jobs:
  test:
    runs-on: ubuntu-latest

    strategy:
      matrix:
        emacs_version:
          # Add more lines like this if you want to test on different Emacs versions.
          - 26.3

    steps:
      - name: Set up Emacs
        uses: jcs090218/setup-emacs@master
        with:
          version: ${{matrix.emacs_version}}

      - name: Install Eldev
        uses: emacs-eldev/setup-eldev@v1

      - name: Check out the source code
        uses: actions/checkout@v4

      - name: Test the project
        run: |
          eldev -p -dtT test
Eldev’s terminal autorecognition doesn’t work on GitHub machines
(unlike e.g. on Travis CI). If you want colored output from Eldev,
you need to explicitly enable it using -C
(--color
) global option.
Travis CI
Travis CI is perhaps the most used continuous integration service for Elisp code, or at least it was until the addition of GitHub workflows. The largest problem on Travis CI is installing an Emacs binary of the desired version. Luckily, there are tools that can be used for this: at least EVM and nix-emacs-ci.
EVM
One of the tools to install Emacs is EVM. Steve Purcell
(the author of nix-emacs-ci
) mentions “various issues” he has had
with it; however, many projects use it. Apparently, you need to pin
the Ubuntu distribution used on Travis CI to Trusty for EVM-provided
binaries. Also note that EVM provides binaries only for Linux, so if
you want to test on macOS too, nix-emacs-ci
is a better choice.
If you also want to try it, Eldev provides a simple script
specifically for use on Travis CI that installs Eldev and EVM in one
go. Here is a simple project-agnostic .travis.yml
file that you can
use as a basis:
language: emacs-lisp
dist: trusty

env:
  # Add more lines like this if you want to test on different Emacs versions.
  - EVM_EMACS=emacs-26.3-travis

install:
  - curl -fsSL https://raw.github.com/emacs-eldev/eldev/master/webinstall/travis-eldev-and-evm > x.sh && source ./x.sh
  - evm install $EVM_EMACS --use

script:
  - eldev -p -dtT test
nix-emacs-ci
A newer tool to install Emacs is nix-emacs-ci. Using
it is easy: define environment variable EMACS_CI
with the desired
Emacs version and curl
a single shell script — whether on Linux or
macOS. With one more line you can also install Eldev. It appears to
be slower than EVM, but for continuous integration that’s not terribly
important.
A basic .travis.yml
would look like this:
language: nix

env:
  # Add more lines like this if you want to test on different Emacs versions.
  - EMACS_CI=emacs-26-3

install:
  - bash <(curl https://raw.githubusercontent.com/purcell/nix-emacs-ci/master/travis-install)
  - curl -fsSL https://raw.github.com/emacs-eldev/eldev/master/webinstall/travis-eldev > x.sh && source ./x.sh

script:
  - eldev -p -dtT test
CircleCI
Another frequently used service is CircleCI. I don’t know that much about it; presumably nix-emacs-ci can be used to install Emacs on it. Some projects successfully use Docker images.
Regardless of how you install Emacs, adding Eldev is yet another
one-liner. It is handy to use, because propagating PATH
modifications between different commands on CircleCI is somewhat
non-obvious. To use it, add the following lines in the relevant place
in file .circleci/config.yml
:
...
    - run:
        name: Install Eldev
        command: curl -fsSL https://raw.github.com/emacs-eldev/eldev/master/webinstall/circle-eldev > x.sh && source ./x.sh
Script commands
Once you have Emacs with Eldev set up on the continuous integration
server of your choice, it is time to actually test your project. The
most basic command is, naturally, eldev test
. You might want to add
a few options to both make project loading more similar to that
typical for your users and Eldev’s output more informative:
$ eldev -p -dtT test
To make sure that your project byte-compiles cleanly, use the following command:
$ eldev -dtT compile --warnings-as-errors
Or maybe even this, if you want to make sure that test .el
files
can also be byte-compiled without warnings (this can sometimes catch
more problems):
$ eldev -dtT compile --set all --warnings-as-errors
You can also enforce conformance to certain coding standards by adding
an invocation of lint
to the script
part. Remember, however, that
most linters are continuously being developed. Even if a linter finds
your source warning-free today, it might detect problems tomorrow.
relint
is probably one of the “safer” linters in this regard:
$ eldev -dtT lint re
Robust mode
Eldev can make your continuous integration runs more robust. This means that when facing certain externally-caused errors, Eldev will not give up immediately, but rather wait for a while and then retry — several times. This makes CI runs more reliable and less likely to fail because of reasons completely unrelated to your project.
This is a work in progress, as examples of such errors are pretty difficult to collect: they are intermittent and happen only occasionally. As of 1.5, Eldev will retry if it fails to fetch the contents of a package archive. In practice such an error has been observed at least with MELPA and is probably caused by MELPA updates being non-atomic, meaning that at the end of each archive rebuild there is a relatively short period where its current contents are not reported properly. Another possibility might be network problems.
In any case, normally you don’t even have to do anything to get these
improvements in Eldev. They are activated using option
--robust-mode
(-R
), which by default has value “auto”. If this
value is unchanged, robust mode is inactive on normal machines
(e.g. when you run Eldev locally), but gets activated if environment
variable $CI
is true
, which appears to be an unwritten standard
for continuous integration servers. For example, it is set on
GitHub workflow servers. Anyway, you can
also set this option to “always” or “never” — as you prefer.
Debugging features
Eldev comes with lots of different options and other features that can help you debug problems in your project, in Eldev itself, or in your Eldev scripts.
File Eldev-local
provides a good place to install temporary advices,
overwrite Emacs functions etc. in the process of debugging certain
problems.
Debugging output
There are two common ways of debugging code: interactively, using a debugger program to step through the code and determine at which place things go wrong, and non-interactively, by adding debugging output to relevant places in the code. While Eldev cannot really help you with interactive debugging, it provides several functions useful for generating debugging output.
The main such function is predictably called
eldev-debug
. It accepts the same arguments as message
and writes
the output to stderr in a color that stands out.
If you launch interactive Emacs with your project,
output of the function comes to Emacs stderr, i.e. the console from
which you have started up Eldev. This makes debugging output more
visible and easily relatable to things going on in Emacs: you have two
separate OS-level windows and just need to arrange them in such a way
that they don’t cover each other. Additionally, the
whole project setup procedure is repeated in such Emacs instances,
so e.g. files Eldev
or Eldev-local
can easily affect those too.
You can call eldev-debug
from your project’s code directly or from
an advice defined in Eldev-local
. E.g. you could temporarily add
something like this:
(defadvice my-function (around debug-it activate)
ad-do-it
(eldev-debug "my-function %S -> %S" (ad-get-args 0) ad-return-value))
This will log all calls to my-function
to stderr, along with the
arguments and the produced result.
If you need to see how several functions call each other,
linear logs produced by advices as above may be too confusing. In
such a case you may be interested in macro
eldev-nest-debugging-output
, which indents all Eldev-generated
debugging output inside its body. It is also useful to split output
shown above into two lines: one for when a function gets called and
another — when it returns:
(defadvice my-function-1 (around debug-it activate)
(eldev-debug "my-function-1 %S" (ad-get-args 0))
(eldev-nest-debugging-output ad-do-it)
(eldev-debug "-> %S" ad-return-value))
Assuming you have similar advices for other functions that call each other, output might look something like this:
my-function-1 (...)
  my-function-2 (...)
    my-function-3 (...)
    -> [RESULT OF my-function-3]
  -> [RESULT OF my-function-2]
-> [RESULT OF my-function-1]
Macro eldev-dump
reuses eldev-debug
to show the current value of
its argument variable(s) or even arbitrary form(s) in a human readable
way:
(eldev-dump load-prefer-newer (* 10 10))
prints
load-prefer-newer = t
(* 10 10) = 100
This allows you to save some typing on debugging output forms that are meant to be temporary anyway and get changed quickly.
A pretty useful function to generate debugging output is
eldev-backtrace
. In some cases it is quite difficult to understand
how a function gets called (with given arguments). Seeing a
backtrace can help you out. Using different project
loading modes can be useful here: sometimes a
backtrace in byte-compiled code is more readable and provides the
information you need, while in other cases you rather need
non-compiled forms.
Global option -u
(--cut-backtraces
) lets you make
backtraces more informative by omitting the common outermost frames.
This becomes particularly useful if you call eldev-backtrace
more
than once in order to compare the output. Eldev itself will put
“notches” in backtraces when it invokes project code, and will later
cut the result of eldev-backtrace
at the deepest encountered notch.
You can also do the same — which might be even more useful — with
macro eldev-backtrace-notch
. Cutting is not applied to backtraces
that are printed because of errors in combination with option -d
.
Macro eldev-time-it
can be used to roughly estimate performance of
pieces of code and understand where your project becomes too slow.
Functions and macros above also come in “conditional” form: just add
x
to the function name after the eldev- prefix, e.g. eldev-xdebug
, eldev-xdump
, etc.
These functions by default do nothing, but can be activated or
deactivated using one of eldev-enabling-xdebug
,
eldev-disabling-xdebug
, eldev-maybe-xdebug
,
eldev-maybe-enabling-xdebug
and eldev-maybe-disabling-xdebug
macros, all with different semantics explained in their in-Emacs
documentation. Global option -x
also activates eldev-xdebug
-like
output initially (can still be deactivated programmatically later).
Appropriate usage can drastically reduce debugging output size in a
deeply nested calltree and limit it to only relevant (to the bug you
are investigating) portions.
Another technique to reduce debugging output is to activate it only
after the project has been fully loaded and initialized — depending on
the project, the setup can involve quite a lot of function calls on
its own, e.g. from computation of constants, eval-when-compile
forms
and so on. In this case it might be useful to install debugging
advices inside a with-eval-after-load
form.
Preventing accidental commits
It is, generally, a good idea to use Eldev calls only in files Eldev
and Eldev-local
. Actual project source code should be limited to
using Eldev only while debugging — in all other cases Eldev is hardly
a proper dependency, at least for most projects. However, as I have
found from experience, it is pretty easy to forget and accidentally
commit debugging output changes that were meant only as a temporary
aid for investigating one specific issue.
If you use Git as your VCS, you may want to employ Git hooks that prevent such accidental commits. Eldev even has a special command for that.
Debugging setup scripts
The following is an overview of Eldev features that can help you in debugging project setup scripts and Eldev itself — which is also not bug-proof.
-
Global options
-t
(--trace
),-v
(--verbose
) and-q
(--quiet
) control the amount of output Eldev generates. The first one makes Eldev extra verbose, helping you to understand what it is doing and at which step something goes wrong. -
Global option
-d
(--debug
) makes Eldev print a backtrace if it dies with an Elisp signal (except certain well-defined and explained errors like a missing dependency).
Global option
-Q
(--backtrace-on-abort
) makes Eldev print a backtrace if it is aborted with ^C
. This is useful if your project freezes or has very bad performance, and you want to figure out where exactly this happens. -
Global option
-b
(--backtrace
) lets you adapt backtraces to your screen width and thus make them more readable at the expense of completeness (by default, Eldev doesn’t truncate backtrace lines). It is a good idea to change the default in file~/.config/eldev/config
. Starting with 0.10 this also affects backtraces printed if Eldev or code from the project it executes fails with a signal (see also option--debug
). -
Global option
-T
(--time
) prepends timestamps to all lines of Eldev output, making it easier to spot performance problems. -
Command
prepare
can be used to install all project dependencies — and thus check if they and package archives are specified correctly — without doing anything else. -
Commands
deps
(dependencies
) anddtree
(dependency-tree
) can be used to display list or tree of project dependencies, which is especially useful for large projects unfamiliar to you. -
For many errors, Eldev will print additional hints (unless you specify option
--quiet
). For example: if an error happens during evaluating fileEldev
, the tool will mention this; if a dependency cannot be installed, Eldev will mention what required this dependency (can be non-obvious in larger packages).
Plugins
Plugins are activatable extensions to Eldev functionality. They provide features that are not needed for most projects and are therefore not enabled by default. However, enabling a plugin is trivial — just add line:
(eldev-use-plugin 'PLUGIN-NAME)
to file Eldev
of your project. For example:
(eldev-use-plugin 'autoloads)
As for other configuration, you can also do it in Eldev-local
or
other places.
In the future, plugins may become externally managed and “detached” from Eldev itself (create an issue if you are interested). For now, however, Eldev provides three built-in plugins.
You can check if a project has any plugins activated — and documentation for those plugins:
$ eldev plugins
Run Eldev in quiet mode (-q
) to get only the list, without the long
documentation:
$ eldev -q plugins
Remember that if a project activates a plugin in a non-standard way,
for example from a hook, command plugins
will not see it.
There is currently no way to list all available plugins. However, as yet there are only three plugins anyway.
autoloads
A plugin that enables automatic collection of functions
and other forms marked with ;;;###autoload
cookie in project’s .el
files. It tries to behave exactly the same as for installed Elisp
packages, so that there are no differences between development and
installed versions of the project.
The plugin is not on by default because many projects don’t use
autoloading functionality at all and having file
PACKAGE-autoloads.el
magically appear all the time in them would be
annoying.
To have autoloads automatically collected in your project, just
activate the plugin: add form (eldev-use-plugin 'autoloads)
to the
project’s file Eldev
. You don’t need any additional steps to
instruct Eldev how to use the generated file. In fact, it is able to
do this even without the plugin: the plugin only takes care to build
and update the file as necessary.
If the plugin is activated, you can see a new target :autoloads
in the
output of targets
command. In addition to being built by default,
this file is also generated whenever Eldev needs to load the project:
for commands test
, eval
, exec
and emacs
. Finally, the file is
also registered as a dependency to all .elc
targets in the project;
this way, byte-compiling always has access to an up-to-date list of
autoloaded functions.
This plugin can also be activated in projects you use as local dependencies for other projects. Eldev knows how to keep the autoloads file up-to-date in all local dependencies, regardless of their loading mode.
maintainer
This is a special plugin that adds commands for a project
maintainer’s use, currently release
and update-copyright
. Because not
every developer is a maintainer, these commands are not enabled by
default and are instead available only when the plugin is active, to
avoid potential confusion. The recommended way to activate it,
therefore, is by adding code:
(eldev-use-plugin 'maintainer)
to your (as maintainer’s) personal file Eldev-local
. You can also
do this in ~/.config/eldev/config
to have the commands available
everywhere. If you are the only developer of your project (or simply
don’t care), you can of course do this right in file Eldev
.
The plugin defines a lot of variables that control its behavior. They
are available even if the plugin itself is not active, and it is
actually recommended to set them in file Eldev
, so that if the
plugin is activated, project-specific configuration is already
present.
Command release
The plugin’s main command is called release
:
$ eldev release VERSION
By default it runs interactively. As always in Eldev, this can be
changed from command line (option -N
makes the command
non-interactive) or customized in ~/.config/eldev/config
. The
command refuses to release the project if any validation step fails
(when running interactively, you can override this in some cases). It
won’t push any resulting commits: this is left for you to do
explicitly.
This command is relatively simplistic. If you need really advanced
functionality, try looking for an external tool, e.g.
Release-It. Some external tools may be blendable
into Eldev’s plugin via eldev-release-validators
etc., see
below.
There are four standard validation steps:
-
Check the version. It must be valid from the Elisp point of view and larger than the current project’s version. You can also specify one of the incrementors by name: “major”, “minor”, “patch” or “snapshot”, to have a suitable version number automatically generated for you based on the current version. Normally you need to specify this on the command line, but when running the command interactively, you can also type it in at runtime.
-
Check the project’s working directory. It must be a Git or Mercurial checkout; the command won’t work in non-VCS directories or those belonging to a different VCS at all. The directory must not contain any modifications or unknown files (the latter can be “allowed” using option
--ignore-unknown
,-u
). Finally, Eldev can refuse releasing from certain branches, by default from anywhere but Git’s “master” or Mercurial’s “default”. -
Check the project’s files. Eldev will refuse to release if any file contains a run-together word “DONOTRELEASE” (configurable). This can be left as a reminder to self, for example, as a comment in any file in a place that must be fixed before making a release. This precaution can be skipped with option
--no-file-checks
or manually when running interactively. -
Optionally test the project by running its tests and byte-compiling. This step requires configuration and by default won’t do anything. This is actually not a problem if you have continuous integration properly set up. However, the plugin is currently unable to check results on CI-servers automatically and instead relies on you “knowing what you are doing”.
Testing can be skipped with option
--no-testing
in case you know it passes on a CI-server or really need to make a release immediately, even with known failures. In the interactive mode you can also choose to proceed even if some tests fail.
If the validation passes or if you have interactively chosen to ignore the errors, Eldev proceeds to create the release commit and tag the release. Depending on the project configuration, it may additionally create a post-release commit (not by default).
Command release
is written to be as generic as possible. Eldev
defines a lot of variables that control its behavior. However, they
are not available from the command line. Instead, they should be
customized in file Eldev
on a per-project basis; see the next
section.
Command update-copyright
The plugin provides a very simple command to update
copyright notices, based on Emacs’ built-in feature copyright
. You
can run it simply as:
$ eldev update-copyright
to update the notices everywhere in the project to include the current year, or as e.g.:
$ eldev update-copyright 2023
to include a specific year in the notices. Unlike command release
,
this doesn’t create any VCS commits: it leaves the files modified for
you to validate and commit the changes.
Maintainer project-specific settings
You can and most often should adapt the plugin to your project’s
needs. There are really lots of variables that control command
release
behavior. They have sensible defaults, but your project,
especially if it is complicated, still might need to tweak some of
them. For command update-copyright
you may want to change the value
of eldev-update-copyright-fileset
.
Not every variable is listed here. You can always check Eldev source
code and in-Emacs documentation to find all eldev-release-…
variables if you need yet more customization.
Project name and commit messages
The first variable you might need to set is called
eldev-formatted-project-name
. By default it has no value and
the Elisp-level package name is used in its place. This is often not a
bad default, but package names are typically all-lowercase. By
default the name also makes it into commit messages.
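For example (the formatted name here is, of course, just an illustration):
(setf eldev-formatted-project-name "My Project")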
It might also be necessary to modify eldev-release-commit-message
and eldev-release-post-release-commit-message
. These variables
are accessible from the command line, but if you always use standard
messages when releasing, you can save yourself some typing by entering
them once in file Eldev
. Messages set here may contain
placeholders, see function eldev-substitute
for details.
Validators
Three variables eldev-release-test-local
,
eldev-release-test-other-emacses
and
eldev-release-test-docker-images
let you configure local
testing before releasing. The standard configuration includes
no testing, and Eldev simply relies on you knowing that the code is
bug-free, e.g. from continuous integration
results.
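As a sketch, and assuming a plain non-nil value is enough to turn local testing on (worth verifying in the variable’s in-Emacs documentation), a cautious maintainer could add this to file Eldev:
(setf eldev-release-test-local t)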
In addition to the four standard validation steps, you can define
additional validators by modifying hook eldev-release-validators
.
This lets you perform additional checks when releasing your project,
e.g. checking if the documentation is up-to-date. Macro
eldev-call-process
is particularly useful for implementing those in
case you need to invoke external (i.e. non-Elisp) tools.
Preparing the release
Once validation is complete, Eldev prepares the release commit.
Actually, the only standard step for this is updating the package version.
Eldev can do this automatically whether you use package headers or
have a prewritten …-pkg.el
file.
If you need to perform any other steps, you can use hook
eldev-release-preparators
. Functions hooked here may do whatever is
needed for your project. Any changes in the registered VCS files will
be included in the commit. However, if you create another file, you
need to register it with the VCS first.
Post-release commit
Many projects include a post-release commit in their release flow
that e.g. bumps the project version to a “snapshot” one, so that it is
always obvious whether checked-out source code corresponds to an official
release or not. To enable this for your project, set variable
eldev-release-post-release-commit
to t or a function; for example:
(setf eldev-release-post-release-commit
(lambda (version)
(let ((eldev-release-min-version-size 3))
(eldev-release-next-snapshot-version-unless-already-snapshot version))))
Here we say that we do want to bump a version, but only if it is not a
snapshot already (command release
lets you create snapshot releases
too). We also temporarily rebind eldev-release-min-version-size
to
ensure that snapshot
is added only after the patch (i.e. the third
component) number. If version
is returned unchanged, the
post-release commit is created, but without bumping the version. If
nil is returned, however, the post-release commit is canceled altogether.
Like with the main commit, you can modify hook
eldev-release-post-release-preparators
to add special steps.
undercover
This built-in plugin provides integration with
the undercover tool, which generates coverage reports for
your tests. It is active only for command test
. By
default, behavior of the tool is unaltered (with the exception that
reports are not merged), so effectively it will do nothing unless run
on a supported continuous integration
server.
To have your project’s code coverage statistics automatically gathered during continuous integration, all you need to do is:
-
Activate the plugin: add
(eldev-use-plugin 'undercover)
to your project’s fileEldev
. -
Make sure that command
test
is executed during automated testing (e.g. in file .travis.yml
) in as-is
, source
or built-source
loading mode. If you want, you can run it again additionally in packaged
mode.
The plugin adds two options for command test
: --undercover
(-u
)
and --undercover-report
(-U
). The first option can be used to
configure the plugin and the tool, the second — to change the report
filename. The value for option -u
should be a comma and/or
space-separated list of any of the following flags:
auto
,on
(always
),off
(never
)-
whether to generate the report; default value is
auto
; coveralls
,simplecov
,codecov
,text
-
format of the report to generate; default is
coveralls
; merge
,restart
-
whether to merge with existing report; note that by default report is restarted, i.e. existing report file is deleted;
send
,dontsend
-
whether to send the generated report to coveralls.io (only for the suitable format); default is to send.
Additionally, when eldev-dwim
is non-nil, certain flags can affect
each other:
-
if report format is not set explicitly, it is derived from extension of report filename if possible:
.json
forsimplecov
format,.txt
or.text
for a text report;codecov
format cannot be set this way, currently; -
when requested format is not
coveralls
, report is always generated unlessauto
oroff
(never
) is specified explicitly.
Based on the above, the easiest way to generate a local coverage report is something like this:
$ eldev test -U simplecov.json
Full help for the plugin can always be checked by running eldev
plugins
in a project with the plugin activated.
Filesets
Filesets are lists of rules that determine a collection of files inside a given root directory, usually the project directory. Similar concepts are present in most build tools, version control systems and some other programs. Filesets in Eldev are inspired by Git.
Important examples of filesets are variables eldev-main-fileset
,
eldev-test-fileset
and eldev-standard-excludes
. Default values of
all three are simple filesets, but are not actually restricted to
those: when customizing for your project you can use any valid fileset
as a value for any of these variables. However, for most cases simple
filesets are all that you really need.
Simple filesets
From the Lisp point of view, a simple fileset is a list of strings. A
single-string list can also be replaced with that string. The most
important filesets are eldev-main-fileset
and eldev-test-fileset
.
Using them you can define which .el
files are to be packaged and
which contain tests. Default values should be good enough for most
projects, but you can always change them in file Eldev
if needed.
Each rule is a string that matches a file path — or a part of it — relative
to the root directory. Path elements must be separated with a slash
(/
) regardless of your OS, to be machine-independent. A rule may
contain glob wildcards (*
and ?
) with the usual meaning and also
double-star wildcard (**
) that must be its own path element. It
stands for any number (including zero) of nested subdirectories.
Example:
foo/**/bar-*.el
matches foo/bar-1.el
and foo/x/y/bar-baz.el
.
If a rule starts with an exclamation mark (!
), it is an exclusion
rule. Files that match it (after the mark is stripped) are excluded
from the result. Other (“normal”) rules are called inclusion rules.
Typically, a rule must match any part of a file path (below the root,
of course). However, if a rule starts with /
or ./
it is called
anchored and must match beginning of a file path. For example, rule
./README
matches file README
in the root directory, but not in any
of its subdirectories.
If a rule matches a directory, it also matches all of the files the
directory contains (with arbitrary nesting level). For example, rule
test
also matches file test/foo/bar.el
.
A rule that ends in a slash directly matches only directories. But,
in accordance with the previous paragraph, it also matches all files within such
directories. So, there is a subtle difference: a rule test/
won’t
match a file named test
, but will match any file within a directory
named test
.
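Putting these rules together, a project that keeps its tests under tests/ but wants to exclude generated helper files could use a simple fileset like this in its file Eldev (the file names are hypothetical):
(setf eldev-test-fileset '("./tests/" "!./tests/generated-*.el"))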
Finally, note a difference with Git concerning inclusions/exclusions and subdirectories. Git manual says: “It is not possible to re-include a file if a parent directory of that file is excluded.” Eldev filesets have no such exceptions.
Composite filesets
Eldev also supports composite filesets. They are built using common set/logic operations and can be nested, i.e. one composite fileset can include another. There are currently three types:
(:and ELEMENT...)
-
A file matches an
:and
fileset if and only if it matches every one of its ELEMENT
filesets. (:or ELEMENT...)
-
A file matches an
:or
fileset if and only if it matches at least one of its ELEMENT
filesets. (:not NEGATED)
-
A file matches a
:not
fileset when it doesn’t match its NEGATED
fileset and vice versa.
Evaluated filesets
Finally, some parts of filesets — but not elements of simple filesets!
— can be evaluated. An evaluated element can be a variable name (a
symbol) or a form. When matching, such element will be evaluated
once, before eldev-find-files
or eldev-filter-files
start actual
work.
The result of evaluating such an expression can be an evaluated fileset in
turn — Eldev will keep evaluating elements until results finally
consist of only simple and composite filesets. To prevent accidental
infinite loops, there is a limit of eldev-fileset-max-iterations
on
how many times sequential evaluations can yield symbols or forms.
An example of an evaluated fileset can be seen in the return value of the
eldev-standard-fileset
function. E.g.:
(eldev-standard-fileset 'main)
=> (:and eldev-main-fileset (:not eldev-standard-excludes))
As the result contains references to two variables, they will be evaluated in turn — and so on, until everything is resolved.
Modifying filesets
Eldev contains quite a few variables with filesets that may be
modified by the projects, for example, eldev-test-fileset
,
eldev-standard-excludes
or eldev-files-to-package
. To modify
those, you should create a composite fileset
that refers to the previous value. For example like this:
;; Make sure included sample projects are not compiled etc.
(setf eldev-standard-excludes
`(:or ,eldev-standard-excludes "./samples"))
Previously Eldev documentation and its own source code would use
append
or push
to modify existing filesets. This turned out to be
bad advice, because it implicitly assumes that the modified
filesets are simple, and might lead to unexpected
results (stacktraces or filesets that don’t do what is expected) for
any other fileset type.
Starting with version 1.2 Eldev will print a warning
whenever it detects a “suspicious” modification in any of its standard
filesets, to avoid potential bugs: eldev-files-to-package
is no
longer a simple fileset starting with that version, and some others may
be changed in the future too. If your project triggers this warning,
please modify the erroneous code (in file Eldev
or Eldev-local
) to
be similar to the example above. If you still get warnings, but are
certain that all fileset-modifying code is correct, you can set
variable eldev-warn-about-suspicious-fileset-var-modifications
to
nil
in file Eldev
.
Extending Eldev
Eldev is written to be not just configurable, but also extensible. It
makes perfect sense to have additional code in file Eldev
— if your
project has uncommon building steps. And also in
~/.config/eldev/config
— if you want a special command for your own
needs, for example. Or maybe in Eldev-local
— if you need something
extra only for one specific project that you maintain.
Hooks
Eldev defines several hooks executed at different times (more might be
added later). For historical reasons, Eldev doesn’t follow the Emacs
naming convention of using -hook
only for standard hooks (i.e. those
not accepting any arguments) and -functions
in other cases.
Functions for many of the hooks listed below do receive arguments.
eldev-executing-command-hook (COMMAND)
-
Run before executing any command. Command name (as a symbol) is passed to the hook’s functions as the only argument. This is always the “canonical” command name, even if it is executed using an alias.
eldev-COMMAND-hook
-
Run before executing specific command, functions have no arguments. Eldev itself uses it (i.e. in its file
Eldev
) to print a disclaimer about its fairly slow tests. -
eldev-load-dependencies-hook (TYPE ADDITIONAL-SETS)
-
Executed after successfully loading dependencies. Functions are called with arguments
TYPE
and ADDITIONAL-SETS
. TYPE
is either t
if the project is being loaded for actual use, the symbol load-only
if it is loaded only for side effect (e.g. to build a tree of its dependencies), or nil
if invoked from eldev-load-extra-dependencies
(i.e. if the project is not being loaded at all: only some additional sets). The second argument is a list of additional dependency sets.
eldev-before-loading-dependencies-hook (TYPE ADDITIONAL-SETS)
-
Similar to the previous hook, but called before dependencies are loaded. Function arguments are the same.
-
eldev-build-system-hook
-
Hook executed whenever the build system is used. This is useful since at least commands
build
, compile
and package
invoke the build system: it would be impractical to add the same function to all three hooks.
eldev-test-FRAMEWORK-hook (SELECTORS)
-
Called immediately before executing tests with the given framework (ERT, Buttercup, Doctest, Ecukes). Functions on the hook get passed SELECTORS as the only argument. At this point project dependencies and the additional set test will have been loaded already, so functions can require features from the project.
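For instance, a command-specific hook can be added in file Eldev like this (a minimal sketch; the message text is made up, but it mirrors what Eldev itself does for its slow test suite):
;; Executed before command `test'; functions on this hook take no arguments.
(add-hook 'eldev-test-hook
          (lambda () (eldev-print "Note: the test suite is fairly slow")))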
Writing builders
Eldev build system provides standard builders that cover all basic needs of Elisp packages. However, some projects have uncommon build steps. Instead of writing custom shell scripts, you can integrate them into the overall build process — which also simplifies further development.
An example of a project with additional build steps is Eldev itself. Its executable(s) are combined from an OS-specific executable template and a common Elisp bootstrapping script. For example, bin/eldev is generated from files bin/eldev.in and bin/bootstrap.el.part. However, only the first file counts as the source; see how function eldev-substitute works. There is a simple builder for this in file Eldev of the project:
(eldev-defbuilder eldev-builder-preprocess-.in (source target)
  :short-name "SUBST"
  :message source-and-target
  :source-files "*.in"
  :targets (".in" -> "")
  :collect ":default"
  :define-cleaner (eldev-cleaner-preprocessed
                   "Delete results of preprocessing `.in' files. This is specific
to Eldev itself."
                   :aliases prep)
  (let ((modes (file-modes target)))
    (eldev-substitute source target)
    (when (or modes (string-prefix-p "bin/" target))
      (set-file-modes target (or modes #o755)))))
Here eldev-defbuilder is a macro much like defun. It defines an Elisp function named eldev-builder-preprocess-.in and registers it with parameters (the keyword lines before the body) as an Eldev builder. Predictably, list (source target) specifies function arguments.
Let’s skip the keywords for a bit and have a look at the body. It works exactly like in a normal Elisp function. Its job is to generate target from source using builder-specific means. This particular builder calls function eldev-substitute, which does the actual work (this function is available also to your project, should you need it). But your builders could do whatever you want, including launching external processes (C/C++ compiler, a Python script, etc.) and using anything from the Elisp repertoire. Note that the return value of the body is ignored. If building the target fails, the builder should signal an error.
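As a hedged sketch of that freedom, here is a hypothetical builder whose body runs an external program (my-generator is made up) and signals a plain error when it fails:
;; Hypothetical builder: generate a `.c' file from a `.def' file by calling an
;; external program; signal an error if the process exits with non-zero status.
(eldev-defbuilder my-builder-generate-c (source target)
  :short-name "GEN-C"
  :source-files "*.def"
  :targets (".def" -> ".c")
  :collect ":default"
  (unless (eq 0 (call-process "my-generator" nil nil nil source target))
    (error "Failed to generate `%s' from `%s'" target source)))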
Now back to the keyword parameters. As you can see, they all have a name and exactly one value after it. First comes parameter :short-name. It specifies what you see in the target tree of the project, i.e. the builder’s name for the user. It is not required; without it Eldev would have used preprocess-.in as the user-visible name.
The next parameter is :message. It determines what Eldev prints when the builder is actually invoked. For example, when byte-compiling, you’d see messages like this:
ELC some-file.el
That’s because the byte-compiling builder has its :message set to source (the default). Other valid values are target and source-and-target (as in the example). Both source and target can be pluralized (i.e. sources-and-target is also a valid value), but singular/plural is not important in this case as both work identically. Finally, the value of :message can be a function, in which case it is called with the same arguments as the builder itself and should return a string; the sketch after the discussion of :type below shows this form.
The value of the :source-files parameter must be a fileset. In the above example, the fileset consists of only one simple rule (which is actually enough in most cases), but it could also be much more complicated. All files that match the fileset and do not match eldev-standard-excludes will be processed using this builder.
Parameter :targets defines the rule used to construct target names out of sources matched by :source-files. There are several ways to define this rule; we’ll consider them in their own subsection.
Keyword :collect determines how targets generated by this builder are “collected” into virtual targets. In the example all such targets are simply added to the virtual target :default. However, here too we have several other possibilities, which will be described later.
Finally, keyword :define-cleaner provides a simple way of linking builders with the cleaning system.
Another important keyword is :type. It is not used here only because the example builder is of the default and most common type that generates one target for each source file. All possible types are: one-to-one (the default), one-to-many (several targets from one source file), many-to-one and many-to-many. If you write a builder of a non-default type, be aware that it will be called with a list of strings instead of a single string as one or both of its arguments, as appropriate. You should probably also name the arguments in the plural in the definition in this case, to avoid confusion.
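For instance, a builder of type many-to-one might look like this (a hedged sketch: the builder name, the *.part pattern and the target file are all made up). It also demonstrates the function form of :message mentioned earlier:
;; Hypothetical builder: concatenate all `.part' files in the project into one
;; file.  Being of type `many-to-one', it receives SOURCES as a list of names.
(eldev-defbuilder my-builder-concat-parts (sources target)
  :type many-to-one
  :short-name "CONCAT"
  ;; `:message' as a function: called with the same arguments as the builder,
  ;; must return a string.
  :message (lambda (sources target)
             (format "%d part(s) -> %s" (length sources) target))
  :source-files "*.part"
  :targets "generated/combined.txt"
  :collect ":default"
  :define-cleaner (my-cleaner-concatenated-parts "Delete the concatenated file.")
  (make-directory (file-name-directory target) t)
  (with-temp-file target
    (dolist (source sources)
      (insert-file-contents source)
      (goto-char (point-max)))))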
Target rules
Target rules define which target(s) will be built from given source(s). There are several ways to define a target rule. Yet more can be added in the future as real-world needs accumulate.
TARGET
-
All the sources will be passed together as a list to the builder to generate one TARGET. This is suitable for many-to-one builders.
(TARGET-1 [TARGET-2 [...]])
-
Build several TARGETS out of all the sources. This is for many-to-many and one-to-many builders.
(SOURCE-SUFFIX -> TARGET-SUFFIX)
-
Build the target name from the source name by replacing filename suffixes. SOURCE-SUFFIX can also be a list of strings, in which case any suffix from the list will be replaced. This is the type of target rule you can see in the example and is suitable for one-to-one builders. Another use of this rule type can be seen in the byte-compiling builder:
:targets (".el" -> ".elc")
And the most powerful of all target rules: a function (can be a lambda form or a function name). It is called with a list of sources (even if the builder is of one-to-one or one-to-many type) and must return one of the types enumerated above.
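As a sketch, such a function for a hypothetical many-to-many builder could put every target under a made-up generated/ directory; the builder would then refer to it as :targets my-project-generated-targets:
;; Hypothetical function used as a target rule: it receives the list of
;; sources and returns a list of targets (the second rule format above).
(defun my-project-generated-targets (sources)
  (mapcar (lambda (source)
            (concat "generated/" (file-name-nondirectory source)))
          sources))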
Collecting into virtual targets
Real targets generated by the builders can optionally be combined into virtual targets. The latter are used to easily build all real targets of the same type; some (:default, :compile etc.) also have special meaning to certain commands.
Like with the target rules, there are several ways to collect the targets.
VIRTUAL-TARGET
-
All real targets generated by the builder are combined into given VIRTUAL-TARGET. This is what you can see in the example.
(VIRTUAL-TARGET-1 [VIRTUAL-TARGET-2 [... VIRTUAL-TARGET-N]])
-
Combine the real targets into VIRTUAL-TARGET-N, then put it to the preceding virtual target and so on. This format is currently unused in standard Eldev builders. It can generate target trees of this form:
:gen-files
    :gen-sources
        :gen-el
            foo.el.in
            bar.el.in
It is expected (even if not required) that a different builder adds another branch to the tree, actually making it useful.
(ENTRY…), each ENTRY being (REAL-TARGETS VIRTUAL-TARGETS)
-
Both REAL-TARGETS and VIRTUAL-TARGETS must be either a list or a single target string. For each ENTRY this repeats the logic of one of the two formats above, but instead of all targets for the builder it uses only those listed in REAL-TARGETS for the ENTRY. This is not often needed, but can be useful if the builder’s targets come in two or more substantially different kinds.
Like with target rules, you can specify a function here. Such a function gets called with a list of real targets and must return a collection rule in one of the formats listed above.
Summary
To define a builder you need to write an Elisp function that generates target(s) from source(s). If it processes multiple sources at once or generates multiple targets, give it the appropriate :type. Write a fileset that matches its :source-files and a rule to determine target names from those — parameter :targets. If you want the targets grouped together into virtual target(s), add the :collect keyword. You should probably also add a :define-cleaner that removes generated targets.
Parameters :name, :short-name, :message and :briefdoc are all purely presentational and thus not very important. But if you want to write a nice and polished builder, investigate them too.
Adding commands and options
Eldev has lots of standard commands, but sometimes you need to define yet more. Commands should generally be defined for things that cannot be reformulated in terms of building targets. If a command would just create a file, e.g. extract documentation from source code, an additional builder would be more suitable.
Defining a command is not much more complicated than defining a normal Elisp function:
(eldev-defcommand mypackage-parrot (&rest parameters)
  "Repeat parameters from the command line."
  :parameters "TEXT-TO-PARROT"
  :aliases (copycat ape)
  (unless parameters
    (signal 'eldev-wrong-command-usage `(t "Nothing to say")))
  (eldev-output "%s" (mapconcat #'identity parameters " ")))
Macro eldev-defcommand works much like defun, but additionally it adds the new function to the list of Eldev command handlers. The new command receives a name built from the function name by removing the package prefix. If that doesn’t produce the needed result in your case (e.g. if the package prefix is two words in your project), you can always specify the name explicitly using the :command parameter. You can also give your command any number of aliases, as shown above.
Keyword :parameters describes what the command expects to see on the command line. It is used when invoking eldev help COMMAND to improve documentation: all commands are automatically documented. The short one-liner for eldev help is derived from the function’s documentation by taking the first sentence. If this is not good enough in your case, use keyword :briefdoc to set it explicitly.
When a command is invoked from the command line, Eldev calls the corresponding function, passing all remaining parameters to it as strings. The example command above just parrots the parameters back at the user, in accordance with its name.
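As a hedged sketch, a project whose package prefix consists of two words might combine :command and :briefdoc like this (the command name stats, the function and its output are made up):
;; Hypothetical command with an explicit name and a hand-written one-liner
;; for `eldev help'.
(eldev-defcommand my-package-project-stats (&rest _parameters)
  "Print various statistics about the project.

This longer documentation is shown by `eldev help stats'."
  :command stats
  :briefdoc "Print project statistics"
  (eldev-output "Statistics are not collected yet"))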
Defining options
You have probably noticed that the command function we’ve defined doesn’t accept any options. In fact, this is true for all commands in Eldev: options are not passed to them. Eldev takes a different approach: whenever a (recognized) option is encountered on the command line, the appropriate function is called, which is supposed to alter global state. This way it is easy to share options between multiple commands when needed.
So, with that in mind, let’s expand our example command with an option:
(defvar mypackage-parrot-colorize-as nil)

(eldev-defcommand mypackage-parrot (&rest parameters)
  "Repeat parameters from the command line. If you want, I can even
colorize them!"
  :parameters "TEXT-TO-PARROT"
  :aliases (copycat ape)
  (unless parameters
    (signal 'eldev-wrong-command-usage `(t "Nothing to say")))
  (let ((text (mapconcat #'identity parameters " ")))
    (when mypackage-parrot-colorize-as
      (setf text (eldev-colorize text mypackage-parrot-colorize-as)))
    (eldev-output "%s" text)))

(eldev-defoption mypackage-parrot-colorize (&optional style)
  "Apply given STYLE to the parroted text (`section' if not specified)"
  :options (-c --colorize)
  :optional-value STYLE
  :for-command parrot
  (setf mypackage-parrot-colorize-as (intern (or style "section"))))
The definition of mypackage-parrot is updated, but there is nothing Eldev-specific here. Let’s rather have a look at the option definition.
Unlike for the command function, the name of the option function is not important. Instead, how the option looks on the command line is determined by the :options keyword. It can specify any number of alternatives, but they all must be either short-style (single - followed by one letter) or long-style (-- followed by a longer name) options. Some options take a value; this is determined by parameter :optional-value or :value (if the value is mandatory) and must match the arguments in the function definition.
Options can be either global or command-specific. In the latter case — the one you’ll typically need — you define to which command(s) the option applies using the :for-command parameter. In our case its value is a single command, but it can also be a list of commands. A sketch of a global option is shown at the end of this subsection.
To test how the new option works, run:
$ eldev parrot -c Repeat this
It should print text “Repeat this” in bold, unless you’ve disabled output colorizing.
Note that the command doesn’t repeat “-c”, even though it appears on the command line. That’s because Eldev doesn’t pass the options as parameters to commands: only non-option arguments remain.
Documentation (i.e. output of eldev help parrot) for the command we defined above now automatically lists the accepted option:
Usage: eldev [OPTION...] parrot TEXT-TO-PARROT

Command aliases: copycat, ape

Options:
  -c, --colorize[=STYLE]  Apply given STYLE to the parroted text
                          (‘section’ if not specified)

Repeat parameters from the command line.  If you want, I can even
colorize them!
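A global option, by contrast, simply omits :for-command and is recognized by every command. A hedged sketch (the variable, the option names and their purpose are made up), this time with a mandatory value:
;; Hypothetical global option: like all options, it only alters global state.
(defvar my-package-log-file nil)

(eldev-defoption my-package-set-log-file (file)
  "Write additional logs to FILE"
  :options (-L --log-file)
  :value FILE
  (setf my-package-log-file file))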
Reusing options for new commands
Sometimes you want to define a command that is similar to something provided by Eldev by default, yet does something a bit differently — or in addition. Function eldev-inherit-options may come in handy here: it takes all (or at least most) options that can be passed to command A and makes them work also with command B. See the function’s Elisp documentation for the syntax.
Custom test runners
FIXME
Influential environment variables
A few environment variables can affect Eldev’s behavior.
EMACS or ELDEV_EMACS
-
Use given Emacs executable (also for any child processes). If not specified, this defaults to just emacs, which is expected somewhere in PATH.
ELDEV_LOCAL
-
Load Eldev Elisp code from given directory (usually a Git clone of source tree) instead of the normal bootstrapping from MELPA. Should not be needed normally, only when developing Eldev itself.
ELDEV_DIR
-
Directory where user’s configuration file, Eldev’s bootstrapping files etc. are located. When not set, configuration files are looked up either in ~/.eldev if that exists, else in ~/.config/eldev; runtime files are placed either in ~/.eldev or ~/.cache/eldev. Used by Eldev’s own regression tests; should be of no interest for typical use.
CI
-
When this is set to true, Eldev’s robust mode is activated by default.
See also
Other build tools you might want to use instead of Eldev:
-
Cask — the most established Emacs project management tool.
-
makem.sh — a shell script that performs many common Elisp development tasks; must be copied to your project.
-
Eask — a build tool that can be seen as a successor to Cask; uses a similar project description file.
-
Keg — another alternative to Cask; likewise, uses a similar project description file.
-
makel — a prebuilt Makefile with typical targets useful to Elisp projects.
EMake — build tool that combines Elisp with GNU Make.
Projects and services that can otherwise help you with developing your Elisp code:
-
EVM — Emacs version manager; has special support for Travis CI.
-
nix-emacs-ci — installer of different Emacs versions that uses Nix and Cachix; useful for continuous integration.
-
GitHub workflows — a part of GitHub available to any hosted project, which can be used for continuous integration among other things.
-
Travis CI — continuous integration service, the most used one for Elisp projects; Eldev has additional support for it.
-
CircleCI — another continuous integration service; Eldev provides a special installation script for it.
-
Coveralls — web service to help you track your code coverage over time; can be integrated with Eldev using a plugin.
-
undercover — a tool for generating test coverage reports for Elisp code; also see Coveralls above.