Here you can find several small pieces of software that I wrote for my own use but that may be useful to others. They are all released under the GPL, so everyone is permitted to redistribute and/or modify them under the terms of the GPL.
Perl ADS
These Perl scripts access the ADS database in an easy way, directly from the shell prompt. They are useful for building a BibTeX database, retrieving the PDF files of a list of papers, or making simple statistics. There are two sets of scripts: the first, somewhat older, is based on LWP::UserAgent and accesses the ADS web pages directly. The second is a rewrite of the first using the Astro::ADS library.

Scripts using the LWP::UserAgent library
ads_lwp.tar.gz (4 KB)
get_author.pl
produce a bibliography in BibTeX format from ADS, given an author name and start/stop years
get_bib.pl
retrieve the BibTeX entry of an article in ADS, given its journal, volume and page
get_bibtex.pl
retrieve the BibTeX entry of an article in ADS, given its bibcode
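These scripts identify an article either by its bibcode or by its journal/volume/page. As an illustration of the input format only (the scripts themselves are in Perl and may build it differently), here is a minimal shell sketch assembling a standard 19-character ADS bibcode from its pieces, assuming the usual ADS convention:

```shell
#!/bin/sh
# Sketch of the standard 19-character ADS bibcode layout:
# YYYY + journal (5 chars, right-padded with dots) + volume (4 chars,
# left-padded with dots) + qualifier + page (4 chars, left-padded with
# dots) + first author's initial.
year=1999 journal=ApJ volume=517 page=565 initial=P
jrnl=$(printf '%-5s' "$journal" | tr ' ' '.')   # "ApJ.."
vol=$(printf '%4s' "$volume" | tr ' ' '.')      # ".517"
pg=$(printf '%4s' "$page" | tr ' ' '.')         # ".565"
bibcode="${year}${jrnl}${vol}.${pg}${initial}"
echo "$bibcode"   # prints 1999ApJ...517..565P
```

With such a bibcode in hand, get_bibtex.pl retrieves the matching BibTeX entry; get_bib.pl starts instead from the journal, volume and page.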
Scripts using the Astro::ADS library.
ads_astro.tar.gz (4 KB)
adsastro.patch
patch for the Astro::ADS library to take the journal/volume/page information as an entry
get_biblio.pl
retrieve a specific BibTeX entry and the PDF of a given astro-ph or ADS article
process_list.pl
retrieve the PDF and BibTeX entry of a list of mixed astro-ph and ADS (J/V/P) entries
get_astroph.pl
get the bibcode of an article published on astro-ph, given its id number
get_bibcode.pl
get the bibcode of an article published in ADS, given its journal, volume and page
get_bibpdf.pl
retrieve the PDF of a given bibcode
SHARC II / CRUSH
To run batch reductions with CRUSH, I also wrote some shell scripts to simplify the process.
You have to edit config.sh to set the correct paths for your configuration. You also need to create a target_sources file in the $LIST_DIR directory which lists all your target source names as reported in the FITS files. Run the make_list.sh script to produce a scan list for each target source. You can then reduce the complete project by running reduce_target.sh, or individual sources with reduce_source.sh source_name.
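The batch loop at the heart of this workflow can be sketched as follows. This is only an illustration, not the actual reduce_target.sh: $LIST_DIR and the target_sources file come from the description above, the source names are invented, and the reduction command is echoed instead of executed.

```shell
#!/bin/sh
# Hedged sketch of the batch loop: read each target source name from
# $LIST_DIR/target_sources and hand it to reduce_source.sh.
LIST_DIR=${LIST_DIR:-./lists}
mkdir -p "$LIST_DIR"
printf 'NGC253\nM82\n' > "$LIST_DIR/target_sources"   # invented example sources
while read -r source; do
    # the real script would run the reduction here; we only echo it
    echo "./reduce_source.sh $source"
done < "$LIST_DIR/target_sources"
```

Keeping the source list in a plain file like this is what lets the same list drive both make_list.sh and the reduction scripts.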

Scripts to reduce a SHARC II project in batch with CRUSH.
crush_batch.tar.gz (4 KB)
config.sh
define all the variable parameters/paths
make_list.sh
make the scan lists for the target sources
reduce_source.sh
reduce one source
reduce_target.sh
reduce all the target sources

I also wrote a few IDL scripts that I use to display astronomical maps taken with SHARC II and reduced with CRUSH. They make intensive use of the IDL Astronomy User's Library and the IMDISP library.
You have to change the default path in read_scan.pro; you can then use the plot_scan routine with at least one argument, the FITS file name, to display the signal (default), the noise (/noise), the signal-to-noise ratio (/s2n), or the integration time (/time). You can also smooth the map to the FWHM of the CSO (/smooth), force the correct aspect ratio of the image (/aspect), crop pixels whose integration time is below a given value (crop=value), put a cross on the pointed position (/cross), switch the axis display between offsets (default) and absolute coordinates (/type), change the contours with respect to time (contours_time=[0,5]), and suppress the wedge (/nowedge).
The plot_scan routine is in fact more general and could be used with any kind of astronomical image, as long as the structure returned by the read_scan routine is filled with the right values.

Scripts to display SHARC II data
reduced with CRUSH
crush_plot.tar.gz (4 KB)
read_scan.pro
read the FITS file, perform unit conversions, and return a structure containing the data
plot_scan.pro
display the scan with various options
GILDAS: Line fitting
Deriving line parameters from a UVFIT in the Grenoble Image and Line Data Analysis System (GILDAS) is not an easy task, so I developed a few programs to do it easily. First you should patch the gio library of GILDAS so that it can save UVFIT results in FITS format. The gildas_fits or fits task will then be able to save the UVFIT result table in FITS format (use the standard FITS format and number of bits = -32). The resulting FITS file can then be used to derive line parameters, for example with the included IDL script, which uses the Markwardt IDL Library to perform the fit. The very useful TeXtoIDL library is also used to label the plot. Uncertainties on the fitted line parameters are derived from the spectrum uncertainties, which are computed from the visibilities and thus do not require the complex deconvolution/reconvolution process involved in mapping interferometric data.
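The fitted model, a Gaussian line on top of a continuum (defined in mp_gauss.pro), presumably takes the standard form below; the exact parameterization used in the code may differ:

```latex
% Gaussian line on a constant continuum C (assumed parameterization)
S(v) = C + A \exp\!\left(-\frac{(v - v_0)^2}{2\sigma^2}\right),
\qquad \mathrm{FWHM} = 2\sqrt{2\ln 2}\,\sigma \simeq 2.355\,\sigma
```

The integrated line flux is then $A\sigma\sqrt{2\pi}$, and the uncertainties on $A$, $v_0$ and $\sigma$ follow directly from the spectrum uncertainties mentioned above.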

Scripts to fit a Gaussian line to a GILDAS UVFIT file
gildas_line.tar.gz (4 KB)
tofits.diff
small patch for the tofits.f file of the gio library of GILDAS
fit_line.pro
fit a Gaussian line to the spectrum
mp_gauss.pro
define the fitted function, a Gaussian with a continuum
IDL ASURV
To easily use and test the Astronomical SURVival Analysis (ASURV) package, I wrote a simple C wrapper which can be called directly, as a library, from IDL or other languages/programs. This work was greatly simplified by the Starlink project's splitting of the original monolithic ASURV code. The univariate two-sample tests, the bivariate correlation tests, and the linear regressions have been wrapped.
IDL Statistics
Two very useful routines to plot the histogram and the repartition function of a dataset. The first uses the IDL histogram function but also computes the proper abscissa.

Statistical plotting routines for IDL.
idl_stat.tar.gz (4 KB)
plot_histo.pro
draw a histogram of a variable
plot_repart.pro
plot the repartition function of a variable
IDL SDSS
I wrote some IDL routines to easily manipulate SDSS spectra. They have been tested on the DR3 QSO catalog and spectra.
IDL Misc.
Here are a few IDL scripts that I wrote for everyday use.

Miscellaneous IDL scripts.
idl_misc.tar.gz (4 KB)
zeta.pro
compute the zeta function
barre.pro
draw a horizontal bar
const.pro
define some physical constants (SI)
copyright.pro
add a copyright notice to your plot
draft.pro
add a draft comment to your plot
Thesis IAS
I adapted the thloria LaTeX class for theses written at IAS. You need to install the thloria class in order to use this one.
