Telstra and the Routable Mobile Internet

There have been many times in the past that, for one reason or another, I have needed a routable cellular connection, but these are just not easy to organise, until now. Thanks to a professional contact I have made through an industry group, we now know the code that needs to be applied to the account/SIM for the connection, and that code is GPTEXB3. You also need to set the APN to telstra.extranet. This has been tested by this person on MikroTik routers, but it should hopefully work with others. I will be giving it a shot in the near future when testing the 4G LTE backup system I have in mind.
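For those on MikroTik gear, a rough sketch of what the RouterOS side might look like is below. I have not run this myself yet, so treat the interface and profile names as assumptions and check them against your own setup;

/interface lte apn add name=telstra-extranet apn=telstra.extranet
/interface lte set lte1 apn-profiles=telstra-extranet

The GPTEXB3 code itself is applied by Telstra to the account/SIM, so there is nothing to configure for it on the router.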

Justin

The VM’s Faulty Pagefile

Recently I was having an issue with one of my virtual machines, specifically the one I use for accounting purposes. Each time I booted the VM I would get an error stating that there was a problem with the paging file. Me being me, I ignored it and continued on with my task, with a case of “it’s only my stuff, I will deal with it later”, and put the issue out of my mind. However, I then started to get errors in the software I was using. Thinking “oh great, what now” I looked at a few things and then came back to this error, which, as it turns out, was the cause of the symptoms I was seeing in the software package I was using.

Windows created a temporary paging file on your computer because of a problem that occurred with your paging file configuration when you started your computer. The total paging file size for all disk drives may be somewhat larger than the size you specified.

Now what does this mean? Well, it can mean one of several things. Most commonly, especially on a VM, it means that the disk is so full that Windows cannot create the paging file when it starts up.

Luckily this is a simple fix, which I am not going to take you through here as the way it is done is entirely dependent on your particular hypervisor (most commonly Hyper-V, VMware, Parallels or VirtualBox). This does, however, assume you have the space free to increase the size of the virtual disk; if you do not, you will need to clear some files off the VM to make space.

OK, but what if you have plenty of space? Well, there are two options off the top of my head that may work: one is removing and then recreating the paging file, as it may have become corrupted, especially if the system had an unclean shutdown (powered off instantly, such as in a power failure); the other is to run a system file check to clean up any errors.

Whilst neither is hard to do, both can take some time to complete. I would suggest starting with the system file scan as it is the easier of the two and the more comprehensive, but both options are outlined below.

System File Scan

To do this you need to open an administrative command prompt


Once you are in the command prompt type;

sfc /scannow


This tool will now run and verify the files Microsoft has put into the system, validating that they are the correct files; if they are not, having been replaced or otherwise modified, it will restore the original file. This process may take some time depending on the hardware you are running it on.


Once complete, you need to restart the PC, and the SFC tool tells you as much.
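If you want to see exactly what SFC checked and repaired, the results end up in the CBS log; Microsoft’s documented way of pulling just the SFC entries out of it is;

findstr /c:"[SR]" %windir%\Logs\CBS\CBS.log > "%userprofile%\Desktop\sfcdetails.txt"

This drops an sfcdetails.txt file on your desktop that you can read at your leisure.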


Restart your PC and see if the issue has been resolved; if not, you may try to manually delete and recreate the pagefile as outlined below.

 

Manual Removal and Recreation of the Pagefile

 

Having logged into your system with an account that has administrative rights (or otherwise authorised yourself for administrative access to the system panel), you need to open the virtual memory settings. If the dialogue box with the warning popped up and you clicked OK, the pagefile controls will already be open, allowing you to skip the first step.

    1. If the virtual memory settings are not already open, open them by right-clicking on Computer → Properties → Advanced System Settings → click the Advanced tab → under Performance, click Settings → go to the Advanced tab → finally, under the Virtual Memory section, click the Change button.

    2. Uncheck the Automatically manage paging file size for all drives checkbox.

    3. Set a “Custom size” for the paging file on the C: drive: 0MB initial, 0MB maximum.

    4. Click OK, close all dialog boxes, and restart your computer.

    5. After logging in again, delete the file C:\pagefile.sys
        a) To do this, you may need to change your folder settings so you can see it first. Open a window of your C: drive and click Organize at the top, then Folder and Search Options.
        b) Click the View tab, and make sure Show hidden files, folders and drives is turned on, and that Hide protected operating system files is not checked.
        c) Click OK, go back to your C: drive, find pagefile.sys and delete it.

    6. Now go back to the virtual memory settings (see step 1 above) and set the paging file for the C: drive to System managed size, then make sure the Automatically manage paging file size for all drives checkbox is checked.

    7. Click OK, close all dialog boxes, and restart your computer.
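If you would rather script steps 2 and 3 than click through the dialogs, the same settings can be changed from an administrative command prompt with wmic. This is a rough sketch assuming the pagefile is in its default spot at C:\pagefile.sys; you still need the reboots and the manual delete of pagefile.sys in between;

wmic computersystem where name="%computername%" set AutomaticManagedPagefile=False

wmic pagefileset where name="C:\\pagefile.sys" set InitialSize=0,MaximumSize=0

Once everything is cleaned up, hand control back to Windows;

wmic computersystem where name="%computername%" set AutomaticManagedPagefile=True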

LFTP and the Stuck Login

I have been working recently on a new backup management system that utilises the Synology and its ability to schedule tasks. Whilst I am ultimately working on a program written in Go to manage multiple backup configurations utilising multiple backup protocols, I have been playing with the underlying software and protocols outside that program. One such piece of software is LFTP. This software allows for the transfer of files utilising FTP, FTPS, SFTP, HTTP, HTTPS and other protocols; the FTP family are the ones that matter for the software I am writing, and most importantly LFTP supports mirroring with them.

Whilst I am writing this software I still wanted to get backups of the system running. To this end I was testing the LFTP commands, and I hit an issue where LFTP would simply not connect to the server, yet the regular FTP client worked fine.

Firstly, we have to understand that LFTP does not connect to the server until the first command is issued; in the case of the example below, this was ls. Once this command is issued LFTP attempts to connect to and log in to the server, and this is where the issue happens: LFTP just hangs at “Logging in”.

user@server backupfolder$ lftp -u username,password ftp.hostname.com
lftp username@ftp.hostname.com:~> ls
`ls' at 0 [Logging in...]

To work out what the issue was I had to do a little research, and it comes down to the fact that LFTP defaults to secure connections. That in and of itself is not a bad thing, in fact it is a good thing, but many FTP servers are yet to implement the SFTP/FTPS protocols, and as such we end up with a hang at login. There are, however, two ways to fix this.

The first way to fix this is to turn off SSL for this connection only, which is done through a modified connect command of

lftp -e "set ftp:ssl-allow=off;" -u username,password ftp.hostname.com

This is best if you are dealing with predominantly secure sites. However, as I said, most FTP servers are still utilising the older insecure FTP protocol, at which point it may be more beneficial to change the LFTP configuration to default to insecure mode (and then enable security where needed for the secure connections; it depends on which you have more of). To do this we need to edit the LFTP config file, as follows.

Using your favourite text editor (vi, nano or whatever, it matters not), open the config file at /etc/lftp.conf.

At some point in the file (I suggest at the end) put the following line

set ftp:ssl-allow false

Save your configuration and the defaulting to secure is turned off, and your LFTP connection should work.
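With that sorted, the actual backup job I am scheduling boils down to a single mirror invocation. A rough sketch is below; the hostname, credentials and paths are placeholders rather than my real setup;

lftp -e "set ftp:ssl-allow off; mirror --verbose /public_html /volume1/backups/site1; quit" -u username,password ftp.hostname.com

The mirror command pulls the remote directory down to the local path, only transferring files that have changed, and the quit at the end stops the session sitting there waiting for input when run as a scheduled task.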

Have Fun

Justin

The Old Backup Regime

After I purchased the NAS box to place at home for my work data (there is a separate one for family data; they do, however, back up to each other, but I will cover that in another post) I decommissioned my old Windows Server 2008 R2 box.

This box, however, did do a multitude of things that were controlled via scheduled tasks and scripts, which I have now moved to the Synology. Chief amongst these was backing up several websites, just for when something goes wrong.

There were several bits of software in the implementation of this task, these were (are);

  • wget (Windows Version) – Command line utility for downloading files, whilst there are other options, this was quick and simple, exactly what I needed
  • FTPSync (CyberKiko) – a great little piece of software; it can display a GUI showing sync progress, which is useful for troubleshooting, or run in a silent mode with no GUI. It utilises simple INI text files for configuration (it encrypts the password), making it easy to configure, and it has many options for doing so
  • DeleteMe (CyberKiko) – Simple file removal tool, give it a folder (it can have multiple set up) and a maximum age of the files in that folder and it will remove anything older than that.
  • 7Zip (Command Line Version) – Command Line zip archive creation utility, what more is there to say
  • Custom PHP DB Export Scripts – Custom PHP scripts that pull the database(s) out of MySQL and zip them up. This was originally run with a cron job, but I found it easier to use wget to pull a trigger file when I wanted the backup created, then pull the backup file itself, then pull a delete trigger

That’s it for the software I use, but what about the backup process itself? For each of the sites I need to back up, the custom PHP scripts were configured on the server. Then a custom batch file containing a bunch of commands (or should that be a batch of commands) downloads and archives the files.

The batch file had the following segments in it to achieve the end goal (a trimmed sketch of the whole thing follows the list);

  1.  Check if backup is still running from previous attempt (Utilizes blank text file that is created at start of script and then removed at end)
    1. If it is running, skip the script and go to the end of the file
    2. If a backup job is not running, create the file locking out other jobs.
  2. Run cleanup of old files
  3. If an existing backup directory for today exists (due to a failed backup job most likely), remove it and create a new one
  4. Start logging output to a log file
  5. Start Repeating Process (Repeats once for each site that is being backed up)
    1. Generate Database Backup
    2. Retrieve Database Backup
    3. Remove Database Backup from the server (via the delete trigger)
    4. Rename Database Backup File
    5. Move Database Backup File to Storage Location
    6. Sync (utilizing FTPSync) the sites directories
    7. Remove Existing zipped backup file of the site’s files and directories if it exists
    8. Zip folder structure and files for the website put the ZIP file in the long term storage folder
  6. Copy Backup Complete information to log file
  7. Remove Process Lock File
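A heavily trimmed sketch of that structure is below. The paths, hostnames and site names are placeholders rather than the real thing, and the exact FTPSync and 7-Zip switches should be checked against their own documentation;

@echo off
rem Bail out if a previous run is still going, otherwise create the lock file
if exist C:\Backups\backup.lock goto end
echo locked > C:\Backups\backup.lock

rem Per-site block, repeated once for each site:
rem trigger the DB export, pull the export down, then trigger its removal
wget -q -O nul http://www.example.com/db-export.php
wget -q -P C:\Backups\Today ftp://backupuser:password@ftp.example.com/db-export.zip
wget -q -O nul http://www.example.com/db-delete.php

rem Sync the site's files then zip them into long term storage
ftpsync.exe example-site.ini
7za.exe a C:\Backups\LongTerm\example-site.zip C:\Backups\Sites\example-site\*

rem Release the lock
del C:\Backups\backup.lock
:end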

To download the batch files, click here

Reasonably simple; to add a new site, copy and paste a previous one, update a few details and off you go.

Now I realize that some of this is perhaps not the best or most secure way to achieve the goal (specifically how I was handling the database) but it was quick, easy and it worked. I could have also made the whole process more efficient by using config files and a for loop, but, well, I didn’t.

Have Fun

Justin

 

Installing a non-Windows Secure Boot capable EFI Virtual Machine in Hyper-V

So you have downloaded an operating system installation disk (Ubuntu 16.04.2 is used in this instructional) and noticed it supports EFI, yet when you try to boot from the ISO you are greeted with a message stating that the machine does not detect it as a valid Secure Boot capable disk; as shown below, it states that “The image’s hash and certificate are not allowed”.

Luckily this is an easy fix, as it is simply Ubuntu and Hyper-V having an argument over the validity of the Secure Boot certificate.

Check out the video I have created showing you how to do this, or alternatively keep reading below for instructions and more details.

 

 

Having turned off your VM, open up the settings page and navigate to the “Security” menu (Server 2016). As you can see in the image below, “Secure Boot” is enabled (checked) and the template is set to “Microsoft Windows”. What this effectively does is limit the Secure Boot function to working only with an appropriately signed Microsoft Windows boot system.

To fix this there are two options, and which you use depends on the operating system you are trying to install. Preferably we want to keep the benefits of Secure Boot, so the best option, if it works for your operating system, is to simply change the template to “Microsoft UEFI Certificate Authority”. This opens up the Secure Boot option to work with a greater range of appropriately signed boot systems, as against the Microsoft Windows one exclusively. The settings for this are shown below.

Click Apply and this should hopefully now work; you can check this by running the virtual machine.

Upon booting your virtual machine, you will now be presented with the boot menu from the disk, allowing you to continue on your way

 

If this change of the CA template for Secure Boot does not work, however, you may need to disable Secure Boot entirely.

To achieve this, go back to the “Security” menu and simply uncheck it as per the image below, click Apply, and it should now work.
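Both changes can also be made from PowerShell on the Hyper-V host if you prefer, using the Set-VMFirmware cmdlet; the VM name below is a placeholder for whatever you called yours;

Set-VMFirmware -VMName "Ubuntu1604" -SecureBootTemplate MicrosoftUEFICertificateAuthority

Set-VMFirmware -VMName "Ubuntu1604" -EnableSecureBoot Off

The first line is the template swap, the second disables Secure Boot entirely; you only need one or the other.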

 

 

Have Fun

Justin

Access a Cisco Switch via USB Console

It may be that you want to use a USB cable, or it may be that, just like me, you forgot your USB to serial adapter, and now you’re faced with connecting to a Cisco switch with a USB cable rather than the serial cable, on OS X.

Well, how do we go about this? With Windows we could simply look up the port number in Device Manager; OS X does not use this reference, instead referring to the device as a TTY USB modem.

First we need to look up the device, which lives with the other devices in /dev/. We also want to limit the output to devices of the USB type, so we are going to restrict the command to that. Open Terminal and type the following command;

ls -ltr /dev/*usb*

This will list all devices in the /dev/ directory (the devices directory) whose names contain the key phrase usb, with all their information, in a list ordered so that the most recently modified device (and therefore most likely the device we are looking for) appears last.

Your device will show up as something such as

tty.usbmodem.12a1

Now we have the path to the device, we need to open a console using it. In OS X the console utility screen is built in, so let’s open it utilising this utility and a baud rate of 9600, which most devices will happily handle. To do this type;

screen /dev/tty.usbmodem.12a1 9600

What this command is saying is: open screen on the device /dev/tty.usbmodem.12a1 utilising a 9600 baud rate. No settings for stop bits etc. are input; you can also utilise other baud rates if needed.
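If a device is fussy about its line settings, screen will also accept stty-style flags after the baud rate. A sketch for the usual 8 data bits, no parity, 1 stop bit, using the same example device name as above;

screen /dev/tty.usbmodem.12a1 9600,cs8,-parenb,-cstopb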

Your terminal will now connect to the console of the Cisco device. This should, however, also work for any other device that utilises a USB chipset to communicate via serial emulation.

Justin
