LFTP and the Stuck Login

I have recently been working on a new backup management system that makes use of the Synology and its ability to schedule tasks. Whilst I am ultimately working on a program written in Go to manage multiple backup configurations across multiple backup protocols, I have also been playing with the underlying software and protocols outside of this program. One such piece of software is LFTP, which allows for the transfer of files using FTP, FTPS, SFTP, HTTP, HTTPS and other protocols. The FTP family is the important one for the software I am writing, most importantly because LFTP supports mirroring over those protocols.

Whilst writing this software I still wanted to get backups of the system running, so I was testing the LFTP commands by hand and hit an issue where LFTP would simply not connect to the server, yet the regular FTP client worked fine.

Firstly, we have to understand that LFTP does not connect to the server until the first command is issued; in the case of the example below, this was ls. Once this command is issued LFTP attempts to connect to and log in to the server, and this is where the issue happens: LFTP just hangs at "Logging in".

user@server backupfolder$ lftp -u username,password ftp.hostname.com
lftp username@ftp.hostname.com:~> ls
`ls' at 0 [Logging in...]

To work out what the issue was I had to do a little research, and it comes down to the fact that LFTP defaults to secure connections. That in and of itself is not a bad thing, in fact it is a good thing, but many FTP servers are yet to implement FTPS (or offer SFTP), and as such we end up with a hang at login. There are, however, two ways to fix this.

The first way to fix this is to turn off SSL for this connection only, which is done through the modified connect command of

lftp -e "set ftp:ssl-allow=off;" -u username,password ftp.hostname.com
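
If you just want a quick one-shot test, the setting, the command and an exit can be combined into a single invocation, something along these lines (here assuming ls is the command you want to run):

lftp -e "set ftp:ssl-allow off; ls; bye" -u username,password ftp.hostname.com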

This is best if you are dealing with predominantly secure sites. However, as I said, most FTP servers are still utilising the older insecure FTP protocol, at which point it may be more beneficial to change the LFTP configuration to default to insecure mode (and then enable security as needed for the secure connections; it depends on which you have more of). To do this we need to edit the LFTP config file, as follows.

Using your favorite text editor (vi, nano or whatever, it matters not), open the config file at /etc/lftp.conf

At some point in the file (I suggest at the end) put the following line

set ftp:ssl-allow false

Save the configuration and the default to secure connections is turned off; your LFTP connection should now work.
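
If you would rather not change the system-wide default, the same line should also work in a per-user configuration file (typically ~/.lftprc), for example:

echo "set ftp:ssl-allow false" >> ~/.lftprc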

Have Fun

Justin

The Old Backup Regime

After I purchased the NAS box to place at home for my work data (there is a separate one for family data; they do, however, back up to each other, but I will cover that in another post) I decommissioned my old Windows Server 2008 R2 box.

This box, however, did do a multitude of things that were controlled via scheduled tasks and scripts, which I have now moved to the Synology. Chief amongst these was the backup of several websites, "just for when" something goes wrong.

There were several bits of software in the implementation of this task. These were (are);

  • wget (Windows Version) – Command line utility for downloading files. Whilst there are other options, this was quick and simple, exactly what I needed
  • FTPSync (CyberKiko) – A great little piece of software. It can display a GUI showing sync progress, which is useful for troubleshooting, or run in a silent mode with no GUI. It uses simple INI text files for configuration (it encrypts the password), making it easy to configure, and it has many options for doing so
  • DeleteMe (CyberKiko) – Simple file removal tool; give it a folder (it can have multiple set up) and a maximum age for the files in that folder and it will remove anything older than that.
  • 7-Zip (Command Line Version) – Command line ZIP archive creation utility, what more is there to say
  • Custom PHP DB Export Scripts – Custom PHP scripts that pull the database(s) out of MySQL and zip them up. These were originally run via a CRON job, but I found it easier to use wget to pull a trigger file when I wanted the backup; the backup was then created, then I pulled the backup file itself, and finally a delete trigger (see the example below)
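
As an illustration of what pulling one of those triggers with wget looks like (the URL and file name here are purely made up, not the real ones):

wget -q -O backup-trigger.log http://www.example.com/dbbackup/trigger.php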

That's it for the software I use, but what about the backup process itself? For each of the sites I need to back up, the custom PHP scripts were configured on the server. Then a custom batch file containing a bunch of commands (or should that be a batch of commands) downloads and archives the files.

The batch file had the following segments in it to achieve the end goal;

  1.  Check if a backup is still running from a previous attempt (utilizes a blank text file that is created at the start of the script and then removed at the end; see the sketch below)
    1. If it is running, skip the script and go to the end of the file
    2. If a backup job is not running, create the file, locking out other jobs.
  2. Run cleanup of old files
  3. If an existing backup directory for today exists (due to a failed backup job most likely), remove it and create a new one
  4. Start logging output to a log file
  5. Start Repeating Process (Repeats once for each site that is being backed up)
    1. Generate Database Backup
    2. Retrieve Database Backup
    3. Remove the Database Backup from the server (via the delete trigger)
    4. Rename Database Backup File
    5. Move Database Backup File to Storage Location
    6. Sync (utilizing FTPSync) the sites directories
    7. Remove Existing zipped backup file of the site’s files and directories if it exists
    8. Zip the folder structure and files for the website and put the ZIP file in the long term storage folder
  6. Copy Backup Complete information to log file
  7. Remove Process Lock File

To download the batch files, click here
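
As a rough sketch of how steps 1 and 7 fit around everything else (the file name and labels here are made up, not from the actual script), the lock file logic looked something like this:

@echo off
REM Hypothetical sketch of the lock file handling (steps 1 and 7)
IF EXIST backup.lock GOTO end
ECHO.> backup.lock
REM ... cleanup, per-site database, sync and zip steps would go here ...
DEL backup.lock
:end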

Reasonably simple: to add a new site, copy and paste a previous block, update a few details and off you go.

Now I realize that some of this is perhaps not the best or most secure way to achieve the goal (specifically how I was handling the database) but it was quick, easy and it worked. I could also have made the whole process more efficient by using a config file and a for loop, but, well, I didn't.

Have Fun

Justin

 

Ubuntu 16.04 Server LTS: Generation 2 Hyper-V VM Install

So you have downloaded Ubuntu 16.04 and noticed it supports EFI, yet when you try to boot from the ISO you are greeted with a message stating that the machine does not detect it as an EFI capable disk, as shown below

 

Luckily this is an easy fix, as it is simply Secure Boot that Ubuntu/Hyper-V are having an argument over.

With your VM turned off, open up the settings page and navigate to the "Firmware" menu. As you can see in the first image below, "Secure Boot" is enabled (checked). To fix this, simply uncheck it as per the second image below, then click "Apply" and "OK".
Upon doing this and restarting your virtual machine, you will now be presented with the boot menu from the disk, allowing you to continue on your way
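
If you would rather do this from PowerShell than the GUI, the equivalent (assuming your VM is named "Ubuntu1604"; swap in your own VM name) is a single command run while the VM is off:

Set-VMFirmware -VMName "Ubuntu1604" -EnableSecureBoot Off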

Have Fun

Justin

The Hidden Cost of the Raspberry Pi (and other “cheap” SBC’s)

The Raspberry Pi and other small single board computers have really taken off in the past few years, especially with the burgeoning wave of development, both commercial but mainly hobbyist, in the Internet of Things (IoT) arena.

Now the Raspberry Pi (I am focusing on the RPi here because it kicked off the whole shebang in a big way; small SBCs existed before then but they were not as widely available or used) was never intended to be an IoT board; it was originally intended to be used to teach programming to children. The success of this original project (with over 5 million, yes that is 5,000,000, sold) has spawned not only a myriad of projects but a whole bunch of clones and similar devices looking to capitalize on the success of the project.

With the hobbyist community getting hold of these devices and putting them into various projects, one has to question the cost of these devices. The devices, for those who do not know, cost US$25 or US$35 depending on the board revision; however, you also need to add an SD card (either standard or micro depending on the revision), a power supply, a case (enclosure) and, if needed, a USB wireless dongle, and you are looking at getting towards US$100. Not as cheap as it sounds, and that's in a basic headless configuration.

The other side to this is the environmental cost. With all these devices (remember there are 5 million RPis alone) floating around that will at some point in their lives end up being thrown out, mostly into landfill, it is not overly environmentally friendly, with all those electronics leaching chemicals and other materials over time. What causes this? Upgrades to newer models, migrations to other platforms, or even loss of interest; the result is the same.

Now don't get me wrong, I am not saying these systems are all wasted, or all an issue. Many interesting projects and products are developed from them, not to mention the education that people get from developing on and for these systems. What I am saying is that their use should be more specialized to where the processing power is actually required, or used to aggregate the data (through a technology such as MQTT), cache it and forward it to a more powerful management system (home server anyone?).

Further to this, the idea here merges nicely with my move to containers (Docker) and my continuing work with Virtual Machines. We take the services the RPi runs for each function and put them into a container, and that container syncs, through either MQTT or the application's own services, to a micro controller which then carries out the functions.

Why is this more efficient? Because the micro controller only needs to be dumb: it needs to either read the data on the interface and report it to the server, or turn an interface on or off (or perhaps "write" a PWM value) to perform a function. This micro controller does not need to be replaced or changed when changing or upgrading the server, and can even be re-tasked to do something else without reprogramming the controller, only changing the functions and code on the mother controller node.
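
As a rough sketch of the idea (the host name and topic names here are made up purely for illustration), the micro controller would publish its readings and listen for simple commands over MQTT, which you can mimic with the mosquitto command line clients:

mosquitto_pub -h homeserver.local -t home/livingroom/temperature -m "21.5"
mosquitto_sub -h homeserver.local -t home/livingroom/light/set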

Much more efficient and effective. It does, however, have the downfall of an extra failure point, so some simple smarts on the micro controller would be a good idea to allow it to function without the mother controller in the event of a failure; but the MQTT controls are agnostic, so we can work with that, at least for monitoring.

Opinions?

Justin

The Home Server Conundrum

Servers have been in the home for just as long as they have been in businesses, but for the most part they have been confined to home labs and to the homes of systems admins and the more serious hobbyists.

However, with more and more devices entering the modern "connected" home, it is time to once again ask: is it time for the server to move into the home? Some companies are, and have been, making inroads and pushing their products into the home market segment, most notably Microsoft and their partners with the "Windows Home Server" systems.

Further to this, modern Network Attached Storage (NAS) devices are becoming more and more powerful, leading not only to their manufacturers publishing their own software for the devices, but also to thriving communities growing up around them and implementing their own software on them; Synology and the SynoCommunity, for example.

These devices are, however, still limited to running specially packaged software, and in many cases are missing the features of other systems. I know this is often by design, as one manufacturer does not want their "killer app" on a competitor's system.

Specifically, what I am thinking of with the above statement is some of the features of the Windows Home Server and Essentials Server products from Microsoft, as many homes are "Microsoft" shops. Yet many homes also have one or more Apple devices (here I am thinking specifically of iPads/iPhones), and given the limited bandwidth and data allowances available to most people, an Apple Caching Server would be of benefit.

Now sure, you could run these on multiple servers, or even on existing hardware that you have around the house, but then you have multiple devices running and chewing up power, which, in this day and age of ever increasing electricity bills and the purported environmental costs of power, is less than ideal.

These issues could at least be partly alleviated by the use of enterprise level technologies such as virtualisation and containerisation; however, these are well beyond the management skills of the average home user to implement and maintain. Not to mention that some companies (I am looking at you here, Apple) do not allow their software to run on "generic" hardware, at least within the terms of the licencing agreement, nor do they offer a way to do this legally by purchasing a licence.

Virtualisation also allows extra “machines” to run such as Sophos UTM for security and management on the network.

Home servers are also going to become more and more important as a bridge or conduit for Internet of Things products to gain access to the internet. Now sure, the products could talk directly back to the servers, and in many cases this will be fine if they can respond locally and, where required, cache their own data when the connection to the main servers is lost, whether through the servers themselves or the internet connection in general being down.

However what I expect to develop over a longer period is more of a hybrid approach, with a server in the home acting as a local system providing local access to functions and data caching, whilst syncing and reporting to an internet based system for out of house control. I suggest this as many people do not have the ability to manage an externally accessible server, so it is more secure to use a professionally hosted one that then talks to the local one over a secure connection.

But more on that in another article, as we are talking about the home server here. So why did I bring it up? Containerisation: many of these devices will want to run their own "server" software or similar, and the easiest way to manage this is going to be through containerisation of the services on a platform such as Docker. This is especially true now that Docker commands and the like are coming to Windows Server systems, as it will provide a basically agnostic method and language to set up and maintain the services.
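
As a simple example of what that could look like (the broker and image here are just an illustration; a real deployment would also want a configuration file mounted into the container), an MQTT broker for those devices could be started on the home server with a single command:

docker run -d --name mqtt -p 1883:1883 eclipse-mosquitto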

This also brings with it questions about moving house, and the on-boarding of devices from one tenant or owner of the property to another. Does the server become a piece of house equipment, staying with the property when you move out? Do you create an "image" for the new occupier to run on their device to configure it to manage all the local devices? Do you again run two servers, a personal one that moves with you and a smaller one that runs all the "smarts" of the house, which then links to your server and presents the devices to your equipment? What about switching gear, especially if your devices use PoE(+) for power? So many questions, but these are for another day.

For all this to work, however, we not only need to work all these issues out; for regular users, the user interface to these systems and the overall user experience are going to be a major deciding factor. That, and we need a bunch of standards so that users can change the UI/controller and still have all the devices work as one would expect.

So far, for the most part, the current systems have done an admirable job of this, but they are still a little too "techie" for the average user, and will need to improve.

There is a lot of potential for the home server in the coming years, and I believe it is becoming more and more necessary to have one, but there is still a lot of work to do before they become ubiquitous devices.

The Case of the Hijacked Internet Explorer (IE) Default Browser Message

I recently had a case of a hijacked Default Browser message (the one that asks you to set the browser as default) in Internet Explorer (IE) 11 on a Windows 8.1 machine. Now that is not to say that it cannot happen to other versions of Windows, Internet Explorer or even other browsers, but this fix will clear the Internet Explorer issue.

As with many of these things, the cause was malware, and the user doing, or rather installing or running, something they shouldn't have (what they wanted the software for was perfectly OK, it's just that they got stung by the malware).

Anyway the issue presented like this;

The Hijacked page, remember do not click on any links

IMPORTANT NOTE: First things first, DO NOT click on any of the links in the page. It is also important to note that the page will still appear even if Internet Explorer is the default browser, or you have told it not to bother you.

Now the first step in this is understanding what has happened, which in this case is that the ieframe.dll file has been hijacked, either through modification or replacement (which indicates that the program would have had to go through UAC with the user OK'ing the change). Specifically, it seems that the page is being redirected, but I cannot confirm this as it was more important to fix the issue than to dig into the technical reasons why.

Nonetheless, the first step is to run a malware cleaner; specifically I use Malwarebytes, and I did a cleanup of the system with CCleaner for good measure. It is important to note that this is just to clean up other things the malware may have left behind; it does not fix this particular problem.

As this problem resides in what is a reasonably well protected file, the best way to fix the issue is with Microsoft’s built-in System File Checker (SFC) tool.

It is actually rather simple to fix this error;

Open a Command Prompt window as Administrator

Open an Administrative Command Prompt

Once you are in the command prompt type;

sfc /scannow

Type sfc /scannow

This tool will now run and verify the files Microsoft has put into the system, validating that they are the correct files; if they are not, having been replaced or otherwise modified, it will replace them with the original files. This process may take some time depending on the hardware you are running it on.

SFC Running – This may take a while

Once complete, you need to restart the PC, and the SFC tool tells you as much

SFC has completed its task, now it wants you to reboot your PC
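
If you want to see exactly what SFC found and repaired, the details are written to the CBS log, which you can filter from the same administrative command prompt with the following (this writes a summary to a text file on your desktop):

findstr /c:"[SR]" %windir%\Logs\CBS\CBS.log > "%userprofile%\Desktop\sfcdetails.txt"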

Restart your PC and the offending window will now be replaced with the default Microsoft one. As I said before, the malware seems to override/overwrite the setting that tells Internet Explorer not to display the defaultbrowser.htm tab (either because it is already the default, or because you have told it not to check). Because that setting was tampered with, the default browser page will still be displayed; to clear this you simply need to either tell it not to display the page again, or go through the set as default process.

Enjoy

Justin
