The following idea is taken from Maven:
I do not recommend using Maven or Gradle themselves (Wikipedia); that is a decision separate from the decision about this file tree. But the idea of the Maven file tree structure is well worth using, also outside of Maven or Gradle.
The basis of this idea of a well-defined file tree is "convention over configuration". That was one of the important steps from the older tool Ant from apache.org towards Maven. In Ant, in simple make systems, or in software in general, there is often no rule about which files are stored where.
But the idea of "convention over configuration" is not the reason for the approach presented here;
the configuration is not so hard to create.
The more important reason is having a well-defined order of files, especially when several components are used.
Primarily, the Maven file tree divides into the sources for the application itself (src/main)
and the test sources (src/test).
Furthermore it defines where built files are stored (build), and some more locations.
That is practical. Here some additional ideas in this direction are presented,
applied more consequently than in Maven or Gradle.
The question of how to deal with components (see next chapter How to separate components) is also a primary concern of this file tree. It is solved differently than in Maven or Gradle.
The idea can be explained as follows, with some additional directories described in the chapters below:
The first level of sources in a working area or "sandbox" is always src.
The second level is main for the sources of the application itself,
or test for the test sources.
The third level describes the kind of sources, for Maven/Gradle often java for Java sources.
The same strategy applies below test.
The third level can also name another kind of sources, for example the so-called resource files for Java.
This holds of course for all languages;
all other kinds of sources are sorted in here too.
Either a language has a commonly known mnemonic, or you find a suitable short name. This system, with the kind of sources as the level below main, is also favored by the Gradle tree.
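The conventional tree described above can be sketched with a few shell commands; the directory names follow the Maven convention, the C/C++ directory is only one example of a further kind of sources.

```shell
# A minimal sketch: create the conventional source tree in an empty working area.
mkdir -p src/main/java         # Java application sources
mkdir -p src/main/resources    # resource files for Java
mkdir -p src/main/cpp          # another kind of sources: C/C++
mkdir -p src/test/java         # test sources, same third-level strategy
ls -R src                      # show the created structure
```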
Documentation is stored outside the sources, which may be appropriate. This approach is not presented in the Maven link above, but it is often found in practice.
Here too, the third level distinguishes the language or the kind of documentation.
You can also introduce further directories of this kind.
These are the folders for the IDE files (IDE = Integrated Development Environment),
which are also located in the working tree.
See chapter libs and tools on the source tree. This directory contains only the files which organize the loading of tools from the internet, not the loaded files themselves.
This can be used as a directory into which external libraries, found in the internet, are loaded. Such a directory can sometimes be found in practice, and it is sensible. But with Maven, external libraries are often stored in a system directory instead. See chapter libs and tools on the source tree.
This directory contains tools for the work, which are usually only simple jar files, batch files or shell scripts. Some of these files are delivered with a given version of an application, but others can be reloaded from an internet archive too. See chapter libs and tools on the source tree.
This directory contains, in sub directories, project files for an IDE (Integrated Development Environment). See especially chapter Directories for the IDE beside src.
A directory where build outputs are written. This directory should be cleaned when a 'build all' is done. It can also refer, via a symbolic link, to a temporary location, for example on a RAM disk.
This directory can be used to store zip archives of the whole working tree.
All in all, the real sources are all located in src,
and in src there is
nothing of temporary stuff, no megabytes of generated files,
and no content which can also be found in the internet.
If you make a zip of the src directory, it contains all that is needed
for exchange and archiving. It is not too much, and it stays manageable. Only the content in src
should be versioned.
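The archiving idea can be sketched in the shell; the file names are invented for the example, and tar is used here instead of zip:

```shell
# Sketch: pack only src for exchange; build stays out of the archive.
mkdir -p work/src/main/java work/build
echo 'class A {}'  > work/src/main/java/A.java   # a "real" source
echo 'temp output' > work/build/A.class          # temporary build result
tar -czf work_src.tar.gz -C work src   # archive contains only the src tree
tar -tzf work_src.tar.gz               # lists src/..., nothing from build
```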
From the view of Maven or Gradle with their Java orientation, the topic of components is clarified by Java itself:
Java has a strong package structure, valid over the whole world. Any source file is well sorted into this package structure. No conflicts exist, and all Java developers respect this approach.
What kind of approach is this?
The package path usually starts, accepted by all, with the URL (internet page address) of the responsible company, or with a commonly accepted name.
So, the Java core sources have the package path java/....
Special Java sources start for example with com/sun
if they were created in the past by the Sun company.
Other ones start with
oracle or, more consequently, with com/oracle,
because they are written by a company which can be found in the internet as
www.oracle.com.
Sources in the package path
org/w3c are from
www.w3.org, etc.
My own sources start with org/vishia.
So, in a Java source tree, all components can be mixed without conflicts.
Of course the further entries in the package path are also well sorted,
under the responsibility of the company which determines the start of the path.
The company part should be handled consistently, so that confusion with other companies is excluded.
There is no company which starts its own sources, for example, with java,
or (I hope so) with the prefix of another company.
Still, this is sometimes not well defined.
Software, or other technical systems, consist of modules. A module is a separately described and testable unit which can be used in different contexts. A source file, or some associated source files with possible dependencies on one another, form a module.
A common understanding is that a component is a larger unit than a module.
A component is either an assembly of modules that forms an independent larger unit, or, specifically in software, an assembly of modules which are delivered together. All modules in a component have the same version and are tuned to one another.
This is a sensible definition. Following it, a component has one repository (for example in git), with its versions.
Of course sub components can be defined. Each (sub) component has its own repository, and a component consisting of sub components has a repository with child dependencies.
From the view of Java: a jar file is a delivery form of a component, possibly consisting of sub components. You should find all the sources of one jar file in one repository. A jar file and its sources have a defined version.
From the view of C/C++ development: some C or C++ sources which are committed together should be seen as a component. Such a component has a version and a responsible maintainer, and may be delivered in the form of a library with header files, or even as a source pool.
For other languages, or also for hardware description files, the same applies.
That is an interesting question.
The components may have the root of their sources inside the source tree, in their own file structure:

  src/main/cpp/myCompany/DepartmA
   +- .git
   +- include
   |  +- headers.h
   +- src_Inner_file_tree
   +- srcxy.cpp
Then the sources of this component are independent of the Maven file tree idea, and they are also independent of any names (which may not have been coordinated).
This is the best way of separation. This may also be valid for Java, though Java sources are already well separated by the package tree.
For example, in my Java sources I usually have the structure:

  src/main/java/srcJava_vishiaBase/.git
                                  /org/vishia/...
  src/main/java/srcJava_vishiaGui/.git
                                 /org/vishia/...
Both components have an additional directory between
java/ and the Java package tree.
That is not intended by Maven,
which expects src/main/java/org/… without a component sub directory. Why?
Maven has another concept for components. Usually it uses complete jar files,
which carry the component structure inside the jar file,
and these jar files are not part of the sources in your own source tree.
Often they are stored in a temporary folder (the local Maven repository in the user's home directory)
and updated automatically from the internet on demand.
This is the Maven approach: get everything over the internet, from the whole world.
But this approach is not proper for every purpose.
In particular, the questions "which version is used", "are all impacts considered"
and some more are not necessarily sufficiently clarified with such an approach.
The second drawback is a poorly controlled dependency on the internet.
But exactly this approach, "everything can be found in the world", is the core approach of Maven.
Maven is oriented to large software packages: no limitations, hard disks have enough space.
This is not recommended by me! And it is also often not appropriate for embedded control.
But the Maven file tree itself is a proper idea.
Having a git repository (or another version management tool) beside the sources, you can see the versions in git also in your working environment. You can compare with other versions, see what is new in a new delivery, and so on.
Note that you can have a
.git directory inside the sources,
or you can have a
.git file which refers to the repository at another location. The
.git file then contains: gitdir: path/to/repository/.git
I prefer the second one, for two reasons:
On manual archiving (creating a zip) you do not have all the repository data in the zip.
You have a mirror location, which makes it easier to compare, change and experiment.
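This second variant can be sketched with git itself: git init --separate-git-dir creates exactly such a small .git file with a gitdir: reference (the directory names here are invented for the example).

```shell
# Sketch: keep the repository outside the working tree; the working tree
# then contains only a one-line .git file referring to it.
rm -rf demo_work demo_repos
mkdir -p demo_repos
git init --separate-git-dir "$PWD/demo_repos/demo_work.git" demo_work
cat demo_work/.git    # a small file: gitdir: .../demo_repos/demo_work.git
```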
From the view of Java this is clarified, see chapter comparison with Java package path. You can simply mix all Java sources in the same file tree. Because the package path corresponds to the component structure, no confusion occurs.
For other platforms it is possible to define an adequate package path in the same way. But unfortunately C and C++ development was established without a view to the internet (the world wide web). Thus, different approaches have been established.
A simple solution: store one component in one concise directory. This is a very simple approach, and it works. You can or should also use sub directories if necessary.
For example, the sources of your own company should be stored in the working tree in a directory with the company's name.
If you have several departments that cannot agree with one another on such things, you should sort the sources into sub directories of your project tree:
You should in any case respect the specific sub structure of the file tree of these sources, see next chapter.
This is a similar approach as in Java, but not necessarily sanctioned and accepted by the owner of the sources. Hence it is your own sorting decision. You should communicate this decision to the owner and talk about the Java package tree.
If you have concise names for software components, use them. They need not be related to a URL if the names are concise enough.
Sometimes one working tree is for one application, another one is for elaborate tests, and a third working tree or sandbox is for another application.
Then it should often be ensured that all applications use exactly the same sources (tests should run with the same sources, and another application should show the same proper behavior as the first one).
There are three ways to achieve this:
a) All components are stored in only one working tree. This is possible; the files cannot be confused, because of the proper inner file tree. But there is a disadvantage: if you want to clarify which sources are used, or deliver the sources as a zip file, it is too much. All applications are in one. The overview of the used files may also be lost.
b) Each component (or specifically determined components) has its own working tree. That is also proper if the maintainers of the working trees are different persons on different computers. Then the content should be kept in sync.
For that, either all working trees use the same version from a central repository (in the network),
or the working trees are synchronized by comparing the file content.
A diff viewer for files in a directory tree usually works fast. The precondition is that the directories are reachable in the same network, or, also possible, an exchanged data storage is used ("Please give me your files on a stick, I will compare them").
This option can also be used temporarily, or to experiment with different versions.
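The content comparison of approach b) can be sketched with a recursive diff; tree and file names are invented for the example:

```shell
# Sketch: compare two working trees file by file.
rm -rf treeA treeB
mkdir -p treeA/src treeB/src
echo 'int version = 1;' > treeA/src/filex.c
echo 'int version = 2;' > treeB/src/filex.c
echo 'same content' | tee treeA/src/same.c > treeB/src/same.c
diff -r treeA treeB || true   # reports only filex.c as differing
```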
c) Each component (or specifically determined components) has its own working tree, but the directories of the used sub components are symbolic links (known from UNIX, but also usable in Windows with
mklink /J linkname path/to/src).
Then, similar to a), changing the sources in one working tree (for one application) offers the changed sources immediately to the other application. Often this effect is desired: one application is tested and changed, then without additional effort the other application or test application is tested with the same changes.
Option c) helps to sort the files (better than a), and gives the opportunity to switch some components to the b) approach and back to c), only by changing the link instead of copying or locally checking out another version.
This means c) is often the best approach.
  D:/software/workingTreeA/src/main/cpp/compn_A/src/filex.cpp
              +
              +-----------+
              +
  N:/networkdrive/software/workingTreeB/src/main/cpp/compn_A/src/filex.cpp

compn_A should be either a copy (to compare) or a linked directory.
A change in D:/software/workingTreeA/ can take immediate effect
in the other tree
N:/networkdrive/software/workingTreeB without effort.
For example, another person can test immediately, or you can start a remote test.
Note that for a network drive on Windows you should use
mklink /D; a simple junction does not work.
But a junction inside the same hard disk is seen properly from another PC in the network.
mklink /J name targetdirpath creates a so-called junction, which works only for a
targetdirpath on the same hard disk, but it is seen properly in the network.
mklink /D name targetdirpath creates a symbolic link, which can also point to any other drive, such as a network drive. This is the equivalent of a symbolic link in UNIX. This command needs administrator rights to execute.
mklink /H name targetfilepath creates a file link, a so-called 'hard link' similar to UNIX, whereby the
targetfilepath needs to be on the same drive.
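The switch between approach c) (linked directory) and approach b) (own copy) can be sketched with the UNIX equivalents of these commands; all names are invented for the example:

```shell
# Sketch: switch a component between a symbolic link and an own copy.
rm -rf shared workA
mkdir -p shared/compn_A workA
echo 'int x;' > shared/compn_A/filex.c
ln -s ../shared/compn_A workA/compn_A   # approach c): link to the shared tree
cat workA/compn_A/filex.c               # same file as in the shared tree
rm workA/compn_A                        # switch to approach b): own copy
cp -r shared/compn_A workA/compn_A
```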
Testing approach using git abilities:
Of course, such scenarios can also be handled in a git test environment. But then all changes must first be committed, untested, for the test; that is not the best approach in all cases where minor changes should be checked for compatibility.
A common approach, also used by such systems:
some necessary tool files are loaded somewhere into the Windows user directories.
Then they are accessible there. For example, on Eclipse usage you find such files there.
The disadvantages of this approach are:
If you need different versions for different work, unfortunately on the same PC, you may get conflicts.
You often have no overview of which tools are used, which are present, whether they are current or still in use.
The amount of tool files seems to be nearly unlimited. Too much.
It is better to have an overview and to have only the really necessary files.
Hence, in my work, I prefer in a working file tree the directories
libs and also tools beside
src and all others.
libs should contain loaded libraries; for the Java approach these are especially
.jar files. For C/C++ usage these can also be pre-compiled libraries.
tools also often contains
.jar files, but not for use in compilation. These are small tools for the work.
The tools directory does not contain the elaborate files, for example for IDEs (Integrated Development Environments), with sizes of gigabytes. It should only contain small tools of a few megabytes, and maybe some shell scripts or batch files.
To avoid using too much space, there is the possibility of using links, see also chapter Save the project files in src/main/IDE where links are also used and explained.
The tool files can usually be found, in proper versions, in internet archives. The sources should also be found beside these archives in the internet, in the best case using a reproducible build approach, see vishia.org/Java/html/source+build/reproducibleJar.html.
So, when delivering a source tree in this form, for example as a zip archive, these files need not be part of it. They can be loaded from the internet only once, on creation or unpacking of this source file tree. After that they are stably present, associated with the application, without conflicts with other applications and independent of the internet.
To load these archives from the internet, a small
tools/minisys_vishia.jar is used,
contained in the git archive as the only common binary. It contains the necessary download functionality.
Wget, known as a Linux command, is unfortunately not available in a standard MinGW
installation, nor is it a standard part of every Linux system.
Hence it is provided with the
minisys_vishia.jar for all systems where Java runs. But
minisys_vishia.jar does more.
GetWebfile works with a
bom, a bill of materials, see the article in German: Jeff Luszcz "Risiken bei Open-Source-Software: Warum eine Bill-of-Materials sinnvoll ist".

  java -cp tools/vishiaMinisys.jar \
    org.vishia.minisys.GetWebfile \
    @tools/bomVishiaJava.txt tools/

(\ is for line continuation).
bomVishiaJava.txt contains the re-check of the
vishiaMinisys.jar, and the check and download of
vishiaGui.jar. The bom contains MD5 checksums. With them, the already existing
vishiaMinisys.jar is checked as to whether its checksum is okay. If it is not, a warning is output. The other files are loaded and checked (whether the download is correct). If they already exist (on a repeated call), the MD5 checksum is built and compared. The MD5 checksum is noted in this archive. Hence it is not possible (within the safety of MD5) to tamper with the files, whether on the server, in the download process, or on the own PC.
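The bom principle can be sketched with the standard md5sum tool; the file names are invented, and this is only the idea, not the real bom format of GetWebfile:

```shell
# Sketch: record checksums once, re-check them on every further use.
rm -f tool.jar bom.txt
echo 'tool content v1' > tool.jar
md5sum tool.jar > bom.txt       # the "bill of material" with checksums
md5sum -c bom.txt               # prints "tool.jar: OK" while unchanged
echo 'tampered' >> tool.jar     # simulate a violated file
md5sum -c bom.txt || echo 'WARNING: checksum violated'
```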
The next important point is: it is documented which files are used and from where. Other systems load downloaded stuff into a home directory (
C:\Users\... on Windows); it is not simply obvious which files come from where. And the third important point is: the sources of these jar files are stored beside the jar file on the server. The jar files can be built reproducibly (see https://www.vishia.org/Java/html5/source+build/reproducibleJar.html).
tools/vishiaBase.jar is a Java executable archive (class files) of about 1.2 MByte, which contains especially the JZtxtcmd script interpreter. It is used to generate the test scripts and for reflection generation (further usage of the sources). It is a necessary component. This file is downloaded from a given URL in the internet. If necessary you can find the sources of this jar file beside the jar file in the same remote directory. With the sources you can debug the tools step by step, for example using the Eclipse IDE (https://www.eclipse.org).
tools/vishiaGui.jar as a Java archive contains the ability to execute the
SimSelectGUI, which is used in
src/test/ZmakeGcc/All_Test/test_Selection.jzT.cmd to build and execute specific test cases. It also contains some other classes, for example for the 'inspector' or the 'file commander'.
In the Maven or Gradle approach, beside
src there should be a build directory:

  +-src
  | +-main
  | +-test
  .....
  +-build
build is the destination folder for all built results, including the final executable.
The executable (the last result of the build) can be copied from there to a delivery directory.
The content of
build should in any case be seen as temporary.
The build process can be repeated at any time, and should be repeated in a 'clean all & rebuild' approach.
It may be recommended to use a RAM disk for this
build directory, organized with a symbolic link,
with the advantage that writing and the build process run faster and the hard disk is not burdened with too much temporary stuff.
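On UNIX-like systems this can be sketched as follows; /tmp stands in for the RAM disk location, and the paths are only examples:

```shell
# Sketch: build/ as a symbolic link into a temporary location, so that
# 'clean all' is cheap and the working tree itself stays small.
rm -rf /tmp/demo_build build
mkdir -p /tmp/demo_build
ln -s /tmp/demo_build build
echo 'object code' > build/main.o   # lands physically in the temp area
rm -rf /tmp/demo_build/*            # 'clean all' leaves the tree untouched
```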
If you want to get a zip archive of the sources, you should not include the
build directory; zip only the src directory.
This idea is not the basis of the Maven or Gradle approach. Maven or Gradle can also be seen as a build system outside of an IDE.
IDE = Integrated Development Environment, such as Eclipse, Visual Studio, or a specific IDE for embedded software.
Firstly, the IDE project files can be seen as part of the sources.
Then they should be stored below
src/main/IDE. But that has a disadvantage:
Often, beside the project files of the IDE, temporary directories are created.
If they are inside
src/main and you make a fast zip backup of the sources,
the difference is: you may have only kilobytes or a few megabytes for the sources,
but hundreds of megabytes with the sources plus the temporaries of the IDE. That is worse.
Separating the IDE directories solves this problem.
  +-src
  | +-main
  | +-test
  .....
  +-build
  +-IDE
    +-Platform_A
    +-Platform_B
    +-Test

As you see in the example tree, you can have more than one set of IDE files, for different platforms, for tests, and maybe also for different applications.
The IDE project files must then always refer to the source files via a back path
(../../src/…), but that is not a problem.
As described in the chapter above, the IDE files should not be part of the src tree;
they should be separated into
IDE beside the
src directory.
But some of the IDE files should be stored in the software version, without additional effort. How to do that?
There is a solution using hard links. What is a hard link?
A hard link has been known in the UNIX world since ~1970. It means the same file content is available from different directories. In UNIX (or Linux) you can create a hard link to an existing file with the command
ln path/to/existingfile linkedfile
Then the same file content is also available in
linkedfile, from the view of the current directory.
Such hard-linked files are a little harder for normal users to understand. Hence this system was not available in the first Windows versions, nor in DOS. But current Windows supports hard links:
mklink /H linkedfile path/to/existingfile
It is the same as in UNIX/Linux, but of course with the other order of arguments :-(
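The "same content, two directory entries" behavior can be sketched in the shell; the file names are invented for the example:

```shell
# Sketch: one file content, two directory entries via a UNIX hard link.
rm -f existingfile linkedfile
echo 'original' > existingfile
ln existingfile linkedfile            # hard link: same inode
echo 'changed via link' > linkedfile  # write through one entry ...
cat existingfile                      # ... visible through the other one
[ existingfile -ef linkedfile ] && echo 'same inode'   # POSIX -ef test
```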
Look on the following example:
For Code Composer Studio (Eclipse based) the project files are stored in
src/main/IDE/MSP430/TimerIntr. The files there are stored and versioned.

  src/main/IDE/MSP430/TimerIntr
   +- +createWorkPrj.bat
   +- HlinkFiles
      +- +clean.bat
      +- +clean_mklinkDebug.bat
      +- .ccsproject
      +- .cproject
      +- .project
      +- lnk_msp430fr4133.cmd
      +- targetConfigs\

+createWorkPrj.bat copies these files as hard links:

  echo off
  set NAMEWS=IDE\MSP430\TimerIntr
  cd %~d0%~p0\..\..\..\..\..
  echo cleans and creates a directory %NAMEWS% beside the src tree
  echo as Workspace for the Project.
  echo the Workspace can always removed again, contains only temp files.
  echo All real sources are linked to the src tree beside.
  if not exist src\main\%NAMEWS%\+createWorkPrj.bat (
    echo ERROR not found: src\main\%NAMEWS%\+createWorkPrj.bat: faulty path
    cd
    if not "%1" == "NOPAUSE" pause
    exit /B
  )
  if exist %NAMEWS% (
    echo WARNING exists: %CD%\%NAMEWS%
    echo will be deleted, press abort ctrl-C to exit
    if not "%1" == "NOPAUSE" pause
    rmdir /S/Q %NAMEWS%
  )
  mkdir %NAMEWS%
  cd %NAMEWS%
  echo creates a so named hard link, the files are the same as in this original directory
  mklink /H .cproject ..\..\..\src\main\%NAMEWS%\HlinkFiles\.cproject
  mklink /H .ccsproject ..\..\..\src\main\%NAMEWS%\HlinkFiles\.ccsproject
  mklink /H .project ..\..\..\src\main\%NAMEWS%\HlinkFiles\.project
  mklink /H lnk_msp430fr4133.cmd ..\..\..\src\main\%NAMEWS%\HlinkFiles\lnk_msp430fr4133.cmd
  mklink /H +clean_mklinkDebug.bat ..\..\..\src\main\%NAMEWS%\HlinkFiles\+clean_mklinkDebug.bat
  mklink /H +clean.bat ..\..\..\src\main\%NAMEWS%\HlinkFiles\+clean.bat
  mklink /J targetConfigs ..\..\..\src\main\%NAMEWS%\HlinkFiles\targetConfigs
  call +clean_mklinkDebug.bat
  dir
  if not "%1" == "NOPAUSE" pause
The result is a newly created file tree at
IDE/MSP430/TimerIntr beside src, with:

  IDE/MSP430/TimerIntr
   +- +clean.bat
   +- +clean_mklinkDebug.bat
   +- .ccsproject
   +- .cproject
   +- .project
   +- lnk_msp430fr4133.cmd
   +- targetConfigs\
   +- Debug\
   +- Release\

Debug and Release are created by calling +clean_mklinkDebug.bat,
which is another topic too. The other files are hard links to the versioned files in HlinkFiles.
If you change the project files, they are automatically changed also in the versioned directory.
But all temporary stuff stays locally in the IDE directory, outside of src.
There is a pitfall: if an editor reads the given file, but on save removes the file
and creates a new one with the same name, the hard link relation is broken.
You should know your tools.
Unfortunately, working with hard links may not be accepted by all tools,
because it is not so familiar in Windows (though known in UNIX since ~1970).
The solution for this problem: compare the file content with
src/main/IDE/… when you have expected changes, and eliminate bad tools.
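Whether the hard link relation is still intact can be checked with the POSIX -ef test; this sketch simulates the "remove and recreate on save" behavior of a bad editor, with invented file names:

```shell
# Sketch: a replace-on-save editor breaks the hard link; detect it with -ef.
rm -f versioned work work.tmp
echo 'project settings' > versioned
ln versioned work                     # hard link: same inode
mv work work.tmp && cp work.tmp work && rm work.tmp   # bad editor: new inode
if [ versioned -ef work ]; then echo 'link intact'; else echo 'link broken'; fi
```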
The content of
+clean.bat shown in the chapter above is:

  echo removes %~d0%~p0\Debug etc (*.db, *.sdf, *.user, .vs, x64 etc)
  if exist %~d0%~p0\debug rmdir /S/Q %~d0%~p0\Debug
  if exist %~d0%~p0\Release rmdir /S/Q %~d0%~p0\Release
It removes the temporary stuff, and supports a 'clean' approach.
+clean_mklinkDebug.bat creates a symbolic link, a 'junction' in Windows, from the Debug and
Release temporary directories to a location in %TMP%.
If you have this
%TMP% located on a RAM disk, you save time and burden on your hard disk.
For a RAM disk you can use, for example, http://memory.dataram.com/products-and-services/software/ramdisk.
It is recommended if you have enough RAM in your system (>= 8 GByte).
Using 1 or 2 GByte as a RAM disk is usually sufficient.
  echo off
  REM next statement: %~d0 is the drive from the calling path, %~p0 is the path from calling path.
  REM it changes to the directory where this file is stored.
  cd /D %~d0%~p0
  call +clean.bat
  set DBG=%TMP%\TUI_EmbMC\TimerIntr
  if exist %DBG% rmdir /S/Q %DBG%
  mkdir %DBG%
  mkdir %DBG%\Debug
  mkdir %DBG%\Release
  if not exist Debug mklink /J Debug %DBG%\Debug
  if not exist Release mklink /J Release %DBG%\Release
  echo TestDebug >Debug\TestDebug.txt
  echo TestRelease >Release\TestRelease.txt
  pause

The same can be done for the
build directory in the root of the working tree:
If you follow the approaches for components in chapter <#CompnGitRepos>, then you have proper versions for all your components. But the versioning of the whole application is still missing.
You should not use approach a) from chapter <CompnGitRepos>, with all applications and tests in one working tree. It may be confusing.
If you have a working tree for one component, maybe with tests in src/test,
you can have a git repository for the whole tree.
But this should not contain all files! It should be a parent git (or other version system)
which refers to the versions of the sub repositories.
I am not favoring a git submodule tree; it is too git-oriented and inflexible. For example, if you only want to have a copy, or sometimes use version systems other than git, or whatever else, it is more flexible to deal with separate repositories. You then have the responsibility to adjust everything manually, of course, but this is done anyway when you decide about the versions. Please follow the idea in the sub chapter:
You can/may/should have one file for each component which clones the repository from a remote location.
For git this is (example for the component src_emC):

  src/main/cpp
   +- +gitclone_src_emC.sh
   +- src_emC

  version="2021-08-31"
  dstdir="src_emC"
  echo this shell script gets the $dstdir core sources of emC
  echo if not exists $dstdir: clone https://github.com/JzHartmut/src_emC.git
  cd `dirname $0`  ##script directory as current
  if ! test -d $dstdir; then
    ##echo for the present clone the src_emC with tag "$version" as 'detached head':
    git clone https://github.com/JzHartmut/src_emC.git -b $version $dstdir
    ##git clone https://github.com/JzHartmut/src_emC.git srcvishia_emC
    cd $dstdir
    pwd
    echo touch all files with the timestamp in emC.filelist:
    #this file is part of test_emC, hence the .../tools exists:
    java -cp ../../../../tools/vishiaBase.jar org.vishia.util.FileList T -l:emC.filelist -d:.
  else
    echo $dstdir already exists, not cloned
  fi
You can use the clone with the given version. If you want to have a clone of the last master commit, for current development work, you can comment and uncomment the adequate lines.
You can choose a proper directory name, appropriate to the component name in your working tree, independent of the name of the remote git repository. But sometimes the name of the remote git repository is also proper as the component's name.
The clone writes the working files into the given dstdir.
Hence this script file should be placed beside the component's directory.
This is proper because, instead of the component, you then have the
+gitclone…sh files in a cleaned src tree.
This may be the delivery form of an application without all components.
Now you can either immediately execute the
+gitclone…sh script,
which writes both the repository and the files into the component's directory.
But this is against the recommendation in chapter Own and separated git repository for each component. It is possible, of course.
The other variant is also simple: clone first, then move the
.git directory to its specific other location, and write a
.git file instead.
Then check out the files at the git revision-archive location to get a mirror.
org.vishia.util.FileList restores the timestamps of all files,
which is not done by the git checkout or git clone commands.
See also restoreTimestampsInSrc.html