Kompx.com or Compmiscellanea.com

CSS centering

1 ) CSS horizontal and vertical centering - 1

Centering a container holding the whole content of a web page within the viewable area of a web browser by means of CSS. A box containing the whole content of the page is CSS centered horizontally and vertically. For modern web browsers : [ More ] : [ Open demo page ]
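The demo behind the link shows the article's own method; as a hedged sketch only, one common way to center a page container in modern browsers (the class name .page is an assumption, not taken from the article) is flexbox:

```css
/* Assumption: a div with class "page" wraps the whole page content */
html, body { height: 100%; margin: 0; }
body {
  display: flex;            /* flexbox layout, modern browsers */
  justify-content: center;  /* horizontal centering */
  align-items: center;      /* vertical centering */
}
```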

2 ) CSS horizontal and vertical centering - 2

Centering a container holding the whole content of a web page within the viewable area of a web browser by means of CSS. A box containing the whole content of the page is CSS centered horizontally and vertically. More conservative than the previous method. Suitable not only for modern web browsers, but for older ones as well - like Internet Explorer 6 or early Maxthon. A shortcoming - the CSS code takes more effort to maintain compared to the [ CSS horizontal and vertical centering - 1 ] method : [ More ] : [ Open demo page ]
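As a hedged illustration of the kind of conservative technique that works down to Internet Explorer 6 (the dimensions and class name are assumptions, not taken from the article): absolute positioning with negative margins. It requires a fixed width and height - changing the box size means recalculating the margins, which is the maintenance cost mentioned above.

```css
/* Assumption: a div with class "page" wraps the page content; width and height are fixed */
.page {
  position: absolute;
  width: 760px;
  height: 500px;
  left: 50%;            /* left edge at the middle of the viewport */
  top: 50%;             /* top edge at the middle of the viewport */
  margin-left: -380px;  /* pull back by half the width */
  margin-top: -250px;   /* pull up by half the height */
}
```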

3 ) CSS centering floated elements

Floated elements of unknown width are CSS centered horizontally : [ More ] : [ Open demo page ]
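A hedged sketch of one classic approach to centering floats of unknown width (class names are illustrative, not taken from the article): an outer float shifted right by 50%, with an inner float shifted back left by 50%.

```css
/* Assumption: .outer wraps the floated items, .inner holds them */
.outer { float: left; position: relative; left: 50%; }
.inner { float: left; position: relative; left: -50%; }
```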

4 ) CSS centering absolutely positioned elements

CSS horizontal centering of an absolutely positioned element : [ More ]
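A minimal sketch of one common way to do this (class name and width are assumptions): stretch the element between the left and right edges and let auto margins center it.

```css
/* Assumption: .box is the absolutely positioned element; a fixed width is required */
.box {
  position: absolute;
  left: 0;
  right: 0;
  width: 400px;
  margin: 0 auto;  /* auto side margins center the fixed-width box */
}
```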

5 ) CSS centering image

CSS horizontal centering of an image : [ More ]
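A minimal sketch of the usual technique (the selector is illustrative): an image is inline by default, so making it a block element lets auto margins center it.

```css
/* Assumption: the image carries the class "centered" */
img.centered {
  display: block;
  margin: 0 auto;  /* horizontal centering */
}
```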

6 ) CSS vertical alignment

CSS vertical alignment of a block element containing text and images. The method works for various combinations of inline and block elements : [ More ]
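As a hedged sketch of one technique that handles mixed inline and block content (class name and height are assumptions, not taken from the article): table-cell display with vertical-align.

```css
/* Assumption: .cell is the block holding the text and images */
.cell {
  display: table-cell;     /* behave like a table cell */
  height: 300px;           /* the cell needs a height to align within */
  vertical-align: middle;  /* vertical centering of the cell's content */
}
```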

7 ) JavaScript + CSS centering

Centering the content of a web page by means of JavaScript and CSS. A block containing the content of a page is JavaScript + CSS centered horizontally and vertically : [ More ] : Two cases :

1. A block is centered if the screen resolution is equal to or greater than 1024x768 : [ Open demo page ]

2. A block is centered if the screen resolution is equal to or greater than 1024x768 and the mouse cursor is moved over a link in an element of the page content : [ Open demo page ]
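A hedged sketch of the resolution check such a script might perform - the function names and the pure-calculation form are illustrative, not taken from the linked demos:

```javascript
// True when the block should be centered: screen resolution
// is equal to or greater than 1024x768 (the condition above).
function shouldCenter(screenWidth, screenHeight) {
  return screenWidth >= 1024 && screenHeight >= 768;
}

// Offsets that place a block of the given size in the middle of the viewport
function centerOffsets(viewportWidth, viewportHeight, blockWidth, blockHeight) {
  return {
    left: Math.max(0, (viewportWidth - blockWidth) / 2),
    top: Math.max(0, (viewportHeight - blockHeight) / 2)
  };
}
```

In a real page the script would read screen.width / screen.height, and apply the offsets to the block's style.left / style.top (in the second case, from a mouseover handler on the link).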


Other topics


Extract tar.gz


Extracting tar.gz files in Linux, command line:

tar zxvf file.tar.gz

- z : filter the archive through gzip [ 1 ]
- x : extract files from an archive
- v : list the files processed
- f : use archive file

The command extracts the contents of a compressed archive to the current directory. Tar creates an archive of one or several files; gzip is then used to compress it. Or both steps are performed at once by tar alone, with the corresponding options employed. This dual nature - archived, then compressed - is reflected in the extension of the file ("tar.gz") and requires two procedures to be performed while extracting: decompressing and unpacking. Hence both z (decompress it) and x (unpack it) in the command.

[ 1 ] Sources for the option letter descriptions: tar(1) - Linux man page and LinuxCommand.org
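As a sketch of the full round trip - creating a .tar.gz and extracting it again - with illustrative file names not taken from the article above:

```shell
# Create a sample directory with one file to archive
mkdir -p demo
echo "hello" > demo/greeting.txt

# c : create an archive, z : compress with gzip, v : verbose, f : archive file name
tar czvf demo.tar.gz demo

# Extract into a separate directory; -C switches there before unpacking
mkdir -p extracted
tar zxvf demo.tar.gz -C extracted

cat extracted/demo/greeting.txt
```

Without -C, the archive contents land in the current directory, as described above.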

Lynx browser. Creating sitemap.xml


There are more than a few online services for sitemap.xml generation. But it is also possible to do it yourself, by means of the lynx web browser and several Linux command line utilities. An example bash script employing them, named "sitemap.sh", is described below.

Bash script creating a sitemap.xml file:

#!/bin/bash
cd /home/me/sitemap/www/
lynx -crawl -traversal -accept_all_cookies -connect_timeout=30 http://www.compmiscellanea.com/ > /dev/null
cd /home/me/sitemap/www2/
lynx -crawl -traversal -accept_all_cookies -connect_timeout=30 http://compmiscellanea.com/ > /dev/null
cat /home/me/sitemap/www2/traverse.dat >> /home/me/sitemap/www/traverse.dat
cat /home/me/sitemap/www/traverse.dat | sed -e 's/\<www\>\.//g' | sort | uniq > /home/me/sitemap/sitemap/sitemap.xml
sed -i 's/\&/\&amp\;/g' /home/me/sitemap/sitemap/sitemap.xml
sed -i "s/'/\&apos\;/g" /home/me/sitemap/sitemap/sitemap.xml
sed -i 's/"/\&quot\;/g' /home/me/sitemap/sitemap/sitemap.xml
sed -i 's/>/\&gt\;/g' /home/me/sitemap/sitemap/sitemap.xml
sed -i 's/</\&lt\;/g' /home/me/sitemap/sitemap/sitemap.xml
sed -i 's/http:\/\//http:\/\/www\./g' /home/me/sitemap/sitemap/sitemap.xml
sed -i -e 's/^/<url><loc>/' /home/me/sitemap/sitemap/sitemap.xml
sed -i -e 's/$/<\/loc><\/url>/' /home/me/sitemap/sitemap/sitemap.xml
sed -i -e '1 i <?xml version="1\.0" encoding="UTF-8"?>\r\r<urlset xmlns="http:\/\/www\.sitemaps\.org\/schemas\/sitemap\/0\.9" xmlns:xsi="http:\/\/www\.w3\.org\/2001\/XMLSchema-instance" xsi:schemaLocation="http:\/\/www\.sitemaps\.org\/schemas\/sitemap\/0\.9 http:\/\/www\.sitemaps\.org\/schemas\/sitemap\/0\.9\/sitemap\.xsd">\r\r<!-- created by sitemap.sh from http:\/\/www.compmiscellanea.com\/en\/lynx-browser-creating-sitemap.xml\.htm -->\r\r' /home/me/sitemap/sitemap/sitemap.xml
sed -i -e '$ a \\r</urlset>' /home/me/sitemap/sitemap/sitemap.xml
sed -i '/static/d' /home/me/sitemap/sitemap/sitemap.xml
echo "...Done"

After the bash script file is prepared: "chmod +x sitemap.sh" to make it executable.
Download sitemap.sh in the sitemap.sh.tar.gz archive ( After downloading and unpacking it, put a web site name with "www" instead of http://www.compmiscellanea.com/ and a web site name without "www" instead of http://compmiscellanea.com/ in the file. Replace "static" in the last line of the file by a string that links should contain to be removed as unnecessary. Then "chmod +x sitemap.sh". Then run sitemap.sh ).

Commentary

Download sitemap2.sh with line by line commentary in the sitemap2.sh.tar.gz archive.

Before running the bash script, three folders should be created. Since the lynx browser may miss some links when the domain name of the web site to be crawled is given with or without "www", the bash script runs lynx twice: crawling the web site by its name with "www" and crawling it by its name without "www". The two result files are put into two of these separate folders, here "/home/me/sitemap/www/" and "/home/me/sitemap/www2/". And "/home/me/sitemap/sitemap/" is for the sitemap.xml created in the end.

1. Path to bash:

#!/bin/bash

2. Going to a folder - lynx browser is going to put there the files obtained from crawling a web site with "www" in its name:

cd /home/me/sitemap/www/

3. Running lynx browser to crawl a web site.
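The three folders the script expects can be created in one command (the paths are the illustrative ones used by the script above):

```shell
# Create the two crawl result folders and the output folder for sitemap.xml
mkdir -p /home/me/sitemap/www /home/me/sitemap/www2 /home/me/sitemap/sitemap
```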