Tuesday, May 31, 2011

Resync MySQL Slave with Master

For various reasons, it is not unusual to find a MySQL Slave instance in a broken state during replication. The quick fix requires a consistent backup of the Master instance. From the backup file, two parameters must be extracted to restore data onto the Slave server:

bin-log filename: MASTER_LOG_FILE
bin-log position: MASTER_LOG_POS

Of course, the overall system performance may be affected during the backup operation. However, MySQL Master server can be kept running as normal without any interruption.

It can be hard to track down all the switches required for a successful online backup operation. Fortunately, an example follows.

Before starting, you'll find it easier to manage everything by opening two separate terminal consoles, one for the Master and one for the Slave instance.

Terminal console for Master server:

The following command (recommended to run under “sudo –s” environment) will dump all databases on Master instance:
$ mysqldump -uroot -p -S /etc/mysql/mysql.sock \
--master-data --hex-blob --opt --single-transaction \
--comments --dump-date --no-autocommit \
--all-databases > target_backup.sql

Please make sure the path of socket file of MySQL Master server is correct.

With the “--master-data” switch, mysqldump records the bin-log filename and position in the dump (by default as an active CHANGE MASTER TO statement; with “--master-data=2” it is written as a comment), which is really useful for resynchronization afterwards.

With the “--single-transaction” switch, the dump runs inside a single consistent-read transaction, so InnoDB tables stay available for the duration of the dump; combined with “--master-data”, a global read lock is held only briefly at the beginning to record the bin-log coordinates.

With the “--hex-blob” switch, binary columns (such as image data stored in BLOB types) are dumped in hexadecimal notation, so they survive the backup and restore without corruption.

Once finished, you'll have a consistent backup file from the MySQL Master server named "target_backup.sql".

Terminal console for Slave server:

Now it is time to load the data onto the MySQL Slave server. Please be aware that the socket file used here is “/etc/mysql2/mysql2.sock”. Don’t mix it up with that of the Master instance.

$ mysql -uroot -p -S /etc/mysql2/mysql2.sock < target_backup.sql

Now, login to MySQL Slave server:

$ mysql -uroot -p -S /etc/mysql2/mysql2.sock


Please check that a MySQL user named “slaveuser” with replication privileges (REPLICATION SLAVE) exists on the Master instance, since the Slave connects to the Master with this account.

Then, execute the following commands to resynchronize Slave server:

mysql> STOP SLAVE;
mysql> RESET SLAVE;
mysql> CHANGE MASTER TO MASTER_HOST='localhost',
    -> MASTER_PORT=3306,
    -> MASTER_USER='slaveuser',
    -> MASTER_PASSWORD='XXXXXXXX',
    -> MASTER_LOG_FILE='mysql-bin.XXXXXXXX',
    -> MASTER_LOG_POS=XXXXX;
mysql> START SLAVE;

For the corresponding values of MASTER_LOG_FILE and MASTER_LOG_POS, please read through the heading comments inside “target_backup.sql” file.
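If you prefer not to eyeball the file, the coordinates can also be pulled out with a little shell. This is a minimal sketch: the CHANGE MASTER line below is a made-up example of what mysqldump with “--master-data” writes near the top of the dump, standing in for your real "target_backup.sql":

```shell
# Hypothetical dump header; real values come from your own target_backup.sql
dump=$(mktemp)
cat > "$dump" <<'EOF'
-- MySQL dump 10.13
CHANGE MASTER TO MASTER_LOG_FILE='mysql-bin.000123', MASTER_LOG_POS=45678;
EOF

# The coordinates appear within the first lines of the dump
coords=$(head -n 40 "$dump" | grep 'CHANGE MASTER TO')
master_file=$(printf '%s' "$coords" | sed "s/.*MASTER_LOG_FILE='\([^']*\)'.*/\1/")
master_pos=$(printf '%s' "$coords" | sed 's/.*MASTER_LOG_POS=\([0-9]*\).*/\1/')
echo "$master_file $master_pos"
```

Plug the two extracted values into the CHANGE MASTER TO statement above.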

To check the status, please issue the following command:

mysql> SHOW SLAVE STATUS;

It is useful to check the error log of MySQL Slave server to confirm whether everything is back on track.
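Beyond the error log, the status output itself tells you most of what you need. A sketch of what to look for, using sample output (the mysql command line in the comment is an assumption based on the socket paths used above; the field names are standard SHOW SLAVE STATUS columns):

```shell
# On a real slave you would run something like:
#   mysql -uroot -p -S /etc/mysql2/mysql2.sock -e "SHOW SLAVE STATUS\G"
# Sample of the fields we care about:
status='Slave_IO_Running: Yes
Slave_SQL_Running: Yes
Seconds_Behind_Master: 0'

# A healthy slave shows Yes for both threads and a small lag
health=$(printf '%s\n' "$status" | grep -cE 'Running: Yes')
echo "$health"
```

If either thread shows No, the Last_Error field of the same output usually explains why.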

Thursday, May 26, 2011

CeBIT Australia 2011

CeBIT Australia 2011 will be held from 31 May to 2 June 2011. Both technology enthusiasts and government sectors are involved in this huge event near Darling Harbour in Sydney. Judging by how many people attended the last CeBIT, it is definitely a good chance to meet top technologists and to hear Julia speak about the latest progress of the big plan for the National Broadband Network.

Apart from the exhibition itself, there are hot topics to be discussed at the conferences. Major events include:
  • NBN Conference
  • Cloud Computing Conference
  • eGovernment Forum
  • eHealth Conference
  • Executive Briefing
  • Retail Conference
  • WebForward Conference
Can't wait to be there and hopefully see you at the exhibition!

Sunday, May 22, 2011

PHP PDF generation over HTTPS issue on IE6 with blank page returned [REALLY SOLVED]

This is a typical question raised all the time about the mysterious IE6 PDF-opening issue over an HTTPS connection:


How can we reliably output PDF dynamically via an HTTPS connection on various browsers, including IE6?


Whatever server-side language you use, you may encounter this yourself once or more in your life. Although people keep trying to find a definite cause, there seems to be more than one.


Say, using PHP, people suggest adding appropriate headers to make the PDF stream open on the fly directly in the IE6 browser.



//size_of_stream is counted by bytes
//Display pdf text stream direct on the browser (IE bug fixed!)
//by setting the content-type


header("HTTP/1.1 200 OK");
header("Status: 200 OK");
header("Accept-Ranges: bytes");
header("Connection: Keep-Alive");


//Comment out for debugging purposes
//header("Cache-Control: public");
//Try setting 1 delta second for caching long enough for Adobe Addon to load PDF content
header("Cache-Control: public, max-age=1");
//Only need to specify User-Agent in Vary header as IE7 only accept that
//Default Vary header value is not welcomed at all
//Fixing IE7 bug for Vary header
header("Vary: User-Agent");
header('Pragma: public');
if (!is_null($size_of_stream)){header("Content-Length: ". $size_of_stream);}
header("Content-Type: application/pdf");
header('Content-Disposition: inline; filename="whatever.pdf"');
header("Content-Transfer-Encoding: binary");


This works in many cases, across all the popular browsers like Firefox, Safari and Chrome. Yet there are still exceptions where the PDF may not open properly on IE6, especially for small-sized PDF streams.


Searching around for any server-side solution, the results were really disappointing in terms of Internet Explorer 6 support.


Finally, I caught a glimpse of a comment posted on another forum which seems to work in almost every case with major versions of Internet Explorer:


In Internet Explorer

  1. Select Tools
  2. Click Internet Options
  3. Select the Advanced tab
  4. Make sure the "Do not save encrypted pages to disk" option, near the bottom in the Security section, is unchecked
This implies there is hardly any programming way to ensure PDF opening perfectly working.

The "Do not save encrypted pages to disk" option should only be enabled by default on Windows Server, whereas on most desktop PCs the following should be applied for security reasons:


  1. Go to the Tools menu
  2. Click Internet Options
  3. Click the Advanced tab
  4. In the "Settings" box, scroll down to the section labeled "Security" 
  5. Click to check the box next to the "Empty Temporary Internet Files folder when browser is closed" option
  6. Click OK to finish



This option does not delete cookies, but it will clear your cache of other files when you close your browser.

Now, the question is how we are going to get everyone in an organization to follow this. This makes me unhappy :(




Related links:


http://joseph.randomnetworks.com/2004/10/01/making-ie-accept-file-downloads/#comment-670


http://robm.fastmail.fm/articles/iecachecontrol.html




However, this is not the end of the story.


Being enthusiastic about my work and diligent about all this, I kept digging into what was happening in a scenario I had met while producing PDFs on the fly.


For IE6, some of my PDF streams displayed correctly regardless of whether the "Do not save encrypted pages to disk" option was disabled or not. When I checked the sizes of the PDFs that were generated successfully on the fly, they were all on the larger side: 120KB, 80KB or 76KB.


I did find some developer posts about a minimum PDF stream size for display in IE. If I remember correctly, someone mentioned roughly 8 kilobytes as the minimum on IE6.


When I went back and checked the problematic PDF file in Firefox, it showed a size of 4 kilobytes. Well, time to run an experiment on this.


Using PHP, it is easy to capture the PDF stream first. You can then calculate the size of the stream with a function like strlen(), which returns the length in bytes. Divide this length by 1024 to get the size in kilobytes.


$len = strlen($pdf_stream); //length in bytes


A simple check tells you whether the length is less than 8192 bytes. When that is the case, echo the PDF stream first:


echo $pdf_stream;


Then pad with space characters up to the threshold:


for ($v = strlen($pdf_stream); $v < 8192; $v++){
   echo ' '; //output space characters until the total output reaches 8KB
}


This makes sure the actual PDF output plus the extra padding spaces occupies at least 8KB, for IE compatibility.
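The padding arithmetic itself is trivial and worth sanity-checking on its own. A quick sketch, assuming a hypothetical 4096-byte PDF stream:

```shell
# Assume a 4096-byte PDF stream (hypothetical size) and the 8KB threshold
len=4096
threshold=8192

# Number of space characters needed to reach the threshold
if [ "$len" -lt "$threshold" ]; then
  pad=$((threshold - len))
else
  pad=0
fi
echo "$pad"
```

For the 4KB file mentioned above, that means appending 4096 spaces to cross the 8KB mark.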


Now, the problem is really solved across the browsers:)



Monday, May 16, 2011

Libgcc_s.so.1 bug on Linux

After a series of package installations onto a Linux machine, some errors started to pop up in the log. Judging from a series of Google searches, it seems to be a problem spread widely among software on the Linux platform. It has been marked as a bug for attention.

Error message would be like this:
...libgcc_s.so.1: version 'GCC_4.2.0' not found (required by /usr/lib/libstdc++.so.6)...

This can happen with all kinds of software already installed on the machine. People could barely find the cause until someone filed a bug for it and suggested a fix: REMOVE IT!

However, it is safer to rename or move the problematic library somewhere it can be retrieved later. I would just rename it like this:

$ cd /destination_folder...
$ sudo mv libgcc_s.so.1 libgcc_s.so.1.bak
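Because the library is only renamed, the change is fully reversible. A sketch of the disable-and-restore cycle, run in a scratch directory standing in for the real library location (the path is illustrative, not the actual one on your system):

```shell
# Scratch directory standing in for the real library location
dir=$(mktemp -d)
touch "$dir/libgcc_s.so.1"

# Disable the problematic copy
mv "$dir/libgcc_s.so.1" "$dir/libgcc_s.so.1.bak"

# ...and if something else breaks afterwards, put it back:
mv "$dir/libgcc_s.so.1.bak" "$dir/libgcc_s.so.1"
ls "$dir"
```

Keeping the .bak copy around means you never lose the original file while experimenting.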

Sometimes you may want to check the file dependency before you take action:
$ ldd /destination_folder.../libgcc_s.so.1

The error should be gone after this.

Some useful materials can be found from the links below:
Fix "version `GCC_4.2.0' not found" for VMware 1.0.6 Server and Ubuntu
`GCC_4.2.0' not found (required by /usr/lib/libstdc+

Wednesday, May 11, 2011

Base64 image fix for Internet Explorer

Although Internet Explorer 6 was released almost ten years ago, people in hospitals are not rushing to adopt a newer version, or an even better alternative, of this piece of software.

For this famous version of Internet Explorer, trivia like its lack of support for images embedded via the data URI scheme become annoying when nurses and doctors complain that they can't see the chart images, or even the signature images (supposedly small in size), on the web pages.

Although data URI images are widely supported by the other web browsers like Chrome, Firefox, Safari and Opera, there is a strong reason for us to take care of IE. We have to accept that it is still the browser most widely spread among those ward computers, i.e., a niche market for web developers.

For anyone who keeps using new techniques to catch the eye, there will always be a dilemma over backward compatibility. Wading through the sea of Google search results, I found quite a lot of opinions on solving the problem. Most of them urge you to keep focusing on the browsers. While some other developers suggest an intrusive way that modifies almost every web page in the project, I prefer this way:


It comes from a blog article dating back to 2005, but it sheds light on a question:

Can we actually repair (fix) those images after they have shown up on an incompatible browser?


The answer is yes. 


Here is the recipe:

  • The jQuery core script file, for integration.
  • A custom jQuery function in a JavaScript file.
  • A minimal PHP script file for Base64 image data processing.



By using client-side JavaScript and an external PHP file, we can redirect the in-line Base64 image stream data to an external HTTP request, obtaining a compatible image object back from server-side image processing.

For today's web stack, PHP (v5.3) + jQuery (v1.6), it is worth reviewing the source code from the above blog article to see what we can do for the new decade.


We would like to apply the fix with one function call, fixBase64Image(), once the web page has loaded completely. This function searches through the DOM and applies the fix to the target elements. A check on browser type and version is also possible, to eliminate unnecessary work on unrelated elements. The target elements here are the image elements "img" whose image source property "src" contains a data URI stream like:



img src="data:image/gif;base64,..."



For the client-side JavaScript function (assuming jQuery is included in your project):

function fixBase64Image() {
  var BASE64_data = /^data:.*;base64/i;
  var BASE64_Path = "base64transfer.php";
  if ($.browser.msie) {
    $("img").each(function() {
      // check matched image source
      if (BASE64_data.test($(this).attr("src"))) {
        // pass image stream data to external php
        var newSrc = BASE64_Path + "?" + ($(this).attr("src")).slice(5);
        $(this).attr("src", newSrc);
      }
    });
  }
}

The JavaScript function repairs the broken images by replacing the source path of each IMG element with an external PHP request to "base64transfer.php", with the Base64 image data encapsulated in the HTTP request as the query string.


For the newly created external PHP file named "base64transfer.php", five lines of code are enough:

<?php
$data = explode(";", $_SERVER["QUERY_STRING"]);
$type = $data[0];
$data = explode(",", $data[1]);
header("Content-type: " . $type);
echo base64_decode($data[1]);


This PHP script simply converts the query string, supposedly a long string of Base64 image data, into raw image stream data and sends it back to the browser for display.
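The splitting the PHP does can be sketched outside PHP as well. Here is the same parse in shell, using a tiny made-up payload ("aGk=" is Base64 for the string "hi"); the query string shown is what arrives after the JavaScript has sliced off the leading "data:":

```shell
# Query string as sent by the JavaScript fix ("data:" already sliced off)
qs='text/plain;base64,aGk='

type=${qs%%;*}      # everything before the first ';'  -> MIME type
payload=${qs##*,}   # everything after the last ','    -> Base64 body

decoded=$(printf '%s' "$payload" | base64 -d)
echo "$type $decoded"
```

In the real script the MIME type becomes the Content-type header and the decoded bytes become the response body.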


This can be considered a quick fix for the lack of support for Base64 in-line images in Internet Explorer 6.