
CSS centering absolutely positioned elements

Windows : Internet Explorer 8.0+, Firefox 1.0+, Google Chrome, Opera 5.0+, Safari 3.1+, SeaMonkey 1.0+ [ 1 ].

Linux : Firefox 1.0+, Google Chrome / Chromium, Opera 5.0+, SeaMonkey 1.0+ [ 2 ], NetSurf 2.6+.

CSS horizontal centering of an absolutely positioned element. Example:


HTML / XHTML. Code:

<div class="example">

<img src="image.jpg" alt="Image" />

</div>

CSS. Code:

.example {position: relative; left: 0px; top: 0px; height: 90px; width: 100%; float: left; padding: 10px; border: 1px #ccc solid; background: #fafafa; -moz-box-sizing: border-box; -webkit-box-sizing: border-box; -ms-box-sizing: border-box; box-sizing: border-box;}

.example img {position: absolute; left: 0px; right: 0px; margin: 0px auto; width: 68px;}

In the example, an absolutely positioned img is centered horizontally. The same method of horizontal centering also works with other absolutely positioned elements, both inline and block.

The width of the absolutely positioned element may also be specified in percent or other units.

The CSS properties of the container holding the element to be centered (here it is .example) may vary. The centering is achieved by the styles applied to the element itself: .example img {position: absolute; left: 0px; right: 0px; margin: 0px auto;}.
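For instance, here is a minimal sketch of the same technique applied to an absolutely positioned block element with a width in percent (the .box class name is made up for this illustration and is not part of the example above):

HTML / XHTML. Code:

<div class="example">
<div class="box">Text</div>
</div>

CSS. Code:

.example {position: relative; height: 90px;}

.example .box {position: absolute; left: 0px; right: 0px; margin: 0px auto; width: 50%;}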


[ 1 ]

As well as Netscape 8.01+, Mozilla 1.5+.

[ 2 ]

As well as Netscape 8.01+, Mozilla 1.5+.


Aliosque subditos et thema

 

Lynx browser. Creating sitemap.xml

 

There are more than a few online services for sitemap.xml generation. But it is also possible to do it yourself, by means of the lynx web browser and several Linux command line utilities. An example bash script employing them, named "sitemap.sh", is described below.

Bash script creating a sitemap.xml file:

#!/bin/bash

cd /home/me/sitemap/www/
lynx -crawl -traversal -accept_all_cookies -connect_timeout=30 http://www.compmiscellanea.com/ > /dev/null

cd /home/me/sitemap/www2/
lynx -crawl -traversal -accept_all_cookies -connect_timeout=30 http://compmiscellanea.com/ > /dev/null

cat /home/me/sitemap/www2/traverse.dat >> /home/me/sitemap/www/traverse.dat

cat /home/me/sitemap/www/traverse.dat | sed -e 's/\<www\>\.//g' | sort | uniq > /home/me/sitemap/sitemap/sitemap.xml

sed -i 's/\&/\&amp\;/g' /home/me/sitemap/sitemap/sitemap.xml
sed -i "s/'/\&apos\;/g" /home/me/sitemap/sitemap/sitemap.xml
sed -i 's/"/\&quot\;/g' /home/me/sitemap/sitemap/sitemap.xml
sed -i 's/>/\&gt\;/g' /home/me/sitemap/sitemap/sitemap.xml
sed -i 's/</\&lt\;/g' /home/me/sitemap/sitemap/sitemap.xml

sed -i 's/http:\/\//http:\/\/www\./g' /home/me/sitemap/sitemap/sitemap.xml

sed -i -e 's/^/<url><loc>/' /home/me/sitemap/sitemap/sitemap.xml
sed -i -e 's/$/<\/loc><\/url>/' /home/me/sitemap/sitemap/sitemap.xml

sed -i -e '1 i <?xml version="1\.0" encoding="UTF-8"?>\r\r<urlset xmlns="http:\/\/www\.sitemaps\.org\/schemas\/sitemap\/0\.9" xmlns:xsi="http:\/\/www\.w3\.org\/2001\/XMLSchema-instance" xsi:schemaLocation="http:\/\/www\.sitemaps\.org\/schemas\/sitemap\/0\.9 http:\/\/www\.sitemaps\.org\/schemas\/sitemap\/0\.9\/sitemap\.xsd">\r\r<!-- created by sitemap.sh from http:\/\/www.compmiscellanea.com\/en\/lynx-browser-creating-sitemap.xml\.htm -->\r\r' /home/me/sitemap/sitemap/sitemap.xml

sed -i -e '$ a \\r</urlset>' /home/me/sitemap/sitemap/sitemap.xml

sed -i '/static/d' /home/me/sitemap/sitemap/sitemap.xml

echo "...Done"

After the bash script file is prepared, "chmod +x sitemap.sh" makes it executable.

Download sitemap.sh in the sitemap.sh.tar.gz archive (after downloading and unpacking it, put the web site name with "www" instead of http://www.compmiscellanea.com/ and the web site name without "www" instead of http://compmiscellanea.com/ in the file. Replace "static" in the last line of the file with a string that the unwanted links to be removed have in common. Then "chmod +x sitemap.sh". Then run sitemap.sh).

Commentary

Download sitemap2.sh with line by line commentary in the sitemap2.sh.tar.gz archive.

Before running the bash script, three folders should be created. Since the lynx browser may miss some links when the web site to be crawled is given only with "www" or only without it, the bash script runs lynx twice: crawling the web site by its name with "www" and crawling it by its name without "www". The two result files are put into two separate folders, here "/home/me/sitemap/www/" and "/home/me/sitemap/www2/". The third folder, "/home/me/sitemap/sitemap/", is for the sitemap.xml created in the end.

1. Path to bash: #!/bin/bash

2. Going to the folder where lynx browser is to put the files obtained from crawling the web site with "www" in its name: cd /home/me/sitemap/www/

3. Running lynx browser to crawl the web site.
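As a minimal sketch of the preliminary setup described in the commentary above, assuming the same /home/me/sitemap/ paths used in the script:

# Create the three folders the script expects
mkdir -p /home/me/sitemap/www /home/me/sitemap/www2 /home/me/sitemap/sitemap
# Make the script executable and run it
chmod +x sitemap.sh
./sitemap.sh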

Screenshots in DOS

 

There are several programs for taking screenshots in DOS. SNARF, for instance. Using this application succeeded in taking screenshots in most cases. Also, the quality of the screenshots (.BMP files) produced by SNARF turned out to be the highest among the programs tested: ScreenThief, VideoThief, FLIP, GRABBER, SNARF.

Using SNARF with default settings is straightforward, but there is a shortcoming: SNARF always saves screenshots to the folder the user is currently in. That can be inconvenient or unacceptable, and there is no obvious way to change it. But there is a workaround. The initial idea was found on this page. The result based on it:

1. SNARF [ Download ]

2. Open SNARF.EXE in a text editor in text mode (not hex), find snarf000.bmp and replace it with s:scn000.bin

3. Create a batch file, S.BAT for example, containing, besides the line that starts SNARF.EXE, a command assigning the path of the folder screenshots will be saved into to a virtual drive S:. The folder and path may be any:

C:\SOFT\SNARF.EXE
SUBST S: C:\SCREENS\

4. Start SNARF: S [or S.BAT]

5. To take a screenshot: Alt + S. There will be two beeps: the first at the beginning and the second one as a sign that the process has completed successfully.

After the screenshots are taken, go to the folder where they were saved and change the file extensions from .BIN to .BMP.

SNARF - Freeware.
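A minimal sketch of that final renaming step, assuming the C:\SCREENS\ folder from the S.BAT example above:

REM Go to the folder the screenshots were saved into
C:
CD \SCREENS
REM Change the extensions of the captured files from .BIN to .BMP
REN *.BIN *.BMP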