URL Monit@r V1.2.x.x Work in Progress
This page is part of the future of URL Monit@r's update cycle.


Here is phase 2 of URL Monit@r in progress.

It is a major update, but not substantial enough to justify V2.0.0.0, so the version will probably be V1.2.x.x.

Some updates might occur earlier if I find a bug or one is reported, as was the case with V1.2.2.0, born from a bug found in V1.2.1.0.


Do not forget this tool is designed for all of us... citizens of planet Earth, and I encourage the community to propose ideas to make this tool one of the best in its category... always free.

Your idea, if implemented, will be credited on the URL Monit@r home page and inside the software, or not, if you prefer.

Suggestions are always welcome, including suggestions for this work in progress. Working hand in hand with the developer is possible here.

Here is my email address link: email the author


Phase 2 is still in progress.

For the moment I am working on, in no specific order:


What I am working on intensively is in bold. What is crossed out has already been released.

This page is a bit like a version history of URL Monit@r's progress.
So I add information at the start and end of this page as upgrades and updates progress.

One of the features I want to implement inside the software is a history log of every capture. One of the fields should contain the domain and the IP address of that domain.
So I pondered for quite a while how to perform the task, and thought about three options:

  1. Go with some online JSON API?
  2. Parse a whois page?
  3. Build an internal function to avoid both the JSON API and the web-page parsing?

While the first two options were pretty obvious, they presented no challenge and would also leave traces in Internet caches. And I wanted to avoid that.

So I decided to write my own function, and to be honest I surprised myself with how easy it was to write and how lightweight it is compared to the first two choices. I created a prototype to test the function and found it so good that I will probably include it in the software as a tool.
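As an illustration of that third option, here is a minimal Python sketch of an internal lookup (the actual tool is written in Delphi; this only shows the idea of resolving straight through the OS resolver instead of calling an online JSON API or scraping a whois page):

```python
import socket

def resolve_domain(domain):
    """Resolve a domain name to its IPv4 addresses using only the OS resolver.

    No external JSON API and no whois-page parsing: the lookup goes straight
    through the standard sockets API, which is the spirit of the author's
    internal function.
    """
    infos = socket.getaddrinfo(domain, None, family=socket.AF_INET)
    # getaddrinfo returns (family, type, proto, canonname, sockaddr) tuples;
    # successive calls can yield different server IPs for large domains,
    # which matches the "same domain, various server IPs" observation.
    return sorted({info[4][0] for info in infos})

print(resolve_domain("localhost"))  # typically ['127.0.0.1']
```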
But no more waiting; below are two screenshots: the first shows the prototype, the second the online verification of the result via a whois site.

The prototype in action. The domain is really google.com.au, linked to google.com. The prototype does not care for now; it just gets the job done. On the first run, and again upon a retry, I received the same domain, various server IPs, same location.

Result from the prototype, verified online.


Various IPs from various servers, but the same location.

One of the funny things when you use an API mixed with your own code is error debugging, so I disconnected the PC and tried the prototype... Everything was fine: it gave me the domain but not the IP, which was expected. For fun I then cleared the domain field and input random data... I was surprised how it was managed by the API I called mixed with my function. Instead of an ugly error, my own LAN IP was shown.

The two screenshots below explain it all: one with a blank input and one with random input in the domain field. Same function, same API, two different results... In fact I had two errors, one indicating my own LAN IP and the other a no-connection error. What you see is the management of these errors, with the obtained results in the IP result field. It should really return a no-connection error in both cases, but okay, I like the result as it is.

It is undocumented, by the way. I tried to find a written explanation, but no documentation exists. You could call this an Easter egg, but I call it an unexpectedly good error result. Micro... is sometimes funny.

No data provided: the API returns my own IP address (LAN exposed to WAN).

Random data: the result is what I expected in both cases.
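The blank-input and random-input behaviour described above can be sketched in Python. The fallback policy below is my assumption for illustration, not the exact behaviour of the API the author called: a blank field falls back to this machine's own hostname (hence a LAN IP appearing), while an unresolvable name yields an explicit error string instead of raw exception text.

```python
import socket

def resolve_with_fallback(domain):
    """Return (domain, ip) with explicit error handling.

    Assumed policy: blank input resolves our own host name (producing the
    local LAN address), and an unknown host produces a clear error message
    rather than an ugly exception.
    """
    if not domain.strip():
        # Blank input: look up our own machine -> local (LAN) address.
        own = socket.gethostname()
        return own, socket.gethostbyname(own)
    try:
        return domain, socket.gethostbyname(domain)
    except socket.gaierror:
        # Random/unresolvable data lands here.
        return domain, "No connection / unknown host"
```

The `.invalid` top-level domain is reserved and never resolves, which makes it handy for exercising the error path.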

As the prototype is fast, I tweaked it a little for PING support. It sends pings via a raw socket and needs elevation for now, but it works.
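For reference, here is what building such an ICMP echo request looks like in Python (actually sending it on a raw socket requires elevation, exactly as the author notes, so this sketch only constructs the packet and its RFC 1071 checksum):

```python
import struct

def icmp_checksum(data: bytes) -> int:
    """Standard Internet checksum (RFC 1071) over 16-bit words."""
    if len(data) % 2:
        data += b"\x00"                       # pad to an even length
    total = sum(struct.unpack("!%dH" % (len(data) // 2), data))
    total = (total >> 16) + (total & 0xFFFF)  # fold the carries back in
    total += total >> 16
    return ~total & 0xFFFF

def build_echo_request(ident: int, seq: int, payload: bytes = b"ping") -> bytes:
    """ICMP type 8 (echo request) packet, ready for a raw socket."""
    # Checksum field is zero while computing the checksum...
    header = struct.pack("!BBHHH", 8, 0, 0, ident, seq)
    chk = icmp_checksum(header + payload)
    # ...then the real value is written into the final packet.
    return struct.pack("!BBHHH", 8, 0, chk, ident, seq) + payload
```

A correctly checksummed packet verifies to zero when the checksum is recomputed over the whole packet, which makes the function easy to self-test.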

Some features might be pushed to phase 3. This all depends on private time investment, the will to do it, and user feedback.

The implementation of the Capture database is done. There is still a lot to do (search records, delete records, save history, export history, adapt the URL field better, and so on). URL Monit@r now records every URL entry it receives, whether from the clipboard or from the integrated browser.
Mainly, it records:
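As a rough idea of what such a capture record might hold, here is an SQLite sketch in Python. The schema and field names are my guesses for illustration, not URL Monit@r's actual database layout:

```python
import sqlite3
from datetime import datetime, timezone

# Hypothetical capture-history table: timestamp, where the URL came from,
# the URL itself, and the resolved domain/IP discussed earlier.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE capture_history (
        id       INTEGER PRIMARY KEY AUTOINCREMENT,
        captured TEXT NOT NULL,   -- UTC timestamp of the capture
        source   TEXT NOT NULL,   -- 'clipboard' or 'browser'
        url      TEXT NOT NULL,
        domain   TEXT,
        ip       TEXT
    )""")

def record_capture(url, source, domain=None, ip=None):
    conn.execute(
        "INSERT INTO capture_history (captured, source, url, domain, ip) "
        "VALUES (?, ?, ?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), source, url, domain, ip))
    conn.commit()

record_capture("https://www.google.com.au/", "clipboard",
               domain="google.com.au", ip="142.250.4.94")
```

Searching, deleting, and exporting records (the remaining to-dos) then become plain SQL queries over this table.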


This is release V1.2.3.0 of URL Monit@r.

Evolution is still in progress, but it is starting to look the way I want.

Moreover you can now:

To accommodate this new tool of URL Monit@r, the Options of the main screen have been rearranged as shown below.

To be honest, I should have adopted this look earlier but hesitated a lot. Now it's done.

A neater GUI for the Options. Please note the clock; it is how you access the Capture history.

Please also note the Capture URL history options:

URL Monit@r can now minimize to the taskbar or the system tray.

Right-clicking the tray icon shows the context menu...

...Each captured URL is also displayed here via a balloon; this can of course be turned off.

PLEASE note that when the option 'Exit minimise the software to tray' is ticked, the only way to exit the software is the Exit button of the context menu.

For now I am working on a function to allow the software to run only once, and to restore the running instance if the user tries to launch a second copy.
It will be a brand new utility unit I can reuse later with other software.
As usual, it is built from the ground up, not depending on any third-party unit, only on what Delphi offers plus my own components and code, of course. That way the source can later be recompiled for other OSes. It therefore takes a little time, even though it is not really hard to do and is already implemented. I just need time to test the results on various PCs and to optimise the code for readability and performance.
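The single-instance idea can be sketched portably in Python. On Windows the idiomatic route (and likely what a Delphi unit would use) is a named mutex plus a message to the existing window; the localhost port-bind trick below is only an illustration of the same principle, with the port number being an arbitrary choice of mine:

```python
import socket

GUARD_PORT = 47653  # arbitrary port picked for this sketch

def acquire_single_instance(port=GUARD_PORT):
    """Try to become 'the' running instance by binding a localhost port.

    The first process binds the port and keeps it for its whole lifetime.
    A second copy fails to bind, knows another instance is running, and
    could then ask the first instance to restore its window.
    """
    guard = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        guard.bind(("127.0.0.1", port))
        guard.listen(1)
        return guard       # keep this object alive; dropping it frees the port
    except OSError:
        guard.close()
        return None        # another instance already holds the port

first = acquire_single_instance()   # succeeds: we are the first copy
second = acquire_single_instance()  # fails: simulates launching a second copy
```

The same pattern works unchanged on any OS with BSD sockets, which matches the author's goal of code that can be recompiled for other platforms.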

All the shortcut keys of the popup context menu are not system-wide hotkeys.
I am creating an options window governing them, plus others...
You will of course be able to choose your own hotkeys instead of the proposed default ones.

This page will talk about it.

Stay tuned.




All rights reserved. Copyright ©2016 Benoit Standaert.
