Firefox needs UTF8 for GeoXml3

I’m developing a WebAPI2 solution with Angular that receives GPS points from Android devices.
Latitude, Longitude, Speed etc. are saved in a SQL Server 2014 table; from the web site the headquarters can follow the navigation in real time.
The chosen approach was to write a KML file from the db data, and show this KML in a Google Maps window.
Having local files (not on another domain), the best solution is to use the GeoXml3 library, which is relatively easy to use (examples here and here).
For the JavaScript part, no problem: a simple setTimeout calls via jQuery a WebAPI2 method that rewrites the KML (the name is the record ID); then, adding a random number to the querystring, the boss can follow the persons wandering somewhere, seeing the KML line that slowly progresses on the map.
But there was a problem: on IE10, IE11, Chrome and Edge all ok; on Firefox (Windows, Ubuntu or an Android tablet) no KML was shown.
Sometimes things are discovered in a very casual manner… I tried to save the KML with another name using Notepad++, and changed the code so as not to use the API call but to show this second file immediately (where I wanted to test some changes in order to understand what was wrong): bingo, KML immediately visible!
This was real head scratching… the files (the one generated by the system, and the one saved as above) looked absolutely the same in Notepad++, no visible difference.
So I compared them with WinMerge and, surprise, the generated file was full of strange characters before every normal character; I immediately realized “oops, the encoding?”, and only then did I notice in the Notepad++ Encoding menu that the file saved by Notepad++ itself was UTF8 encoded, while the one produced by the C# code was not.
In the WebAPI2 method I had effectively used Unicode: so now all the browsers show the KML, writing the file as:

// requires: using System.IO; using System.Text;
private async Task WriteTextAsync(string filePath, string text)
{
    //byte[] encodedText = Encoding.Unicode.GetBytes(text);
    byte[] encodedText = Encoding.UTF8.GetBytes(text);
    using (FileStream sourceStream = new FileStream(filePath, FileMode.Append, FileAccess.Write,
               FileShare.None, bufferSize: 1048576, useAsync: true))
    {
        await sourceStream.WriteAsync(encodedText, 0, encodedText.Length);
    }
}
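The Firefox symptom is explained by the byte-level difference: .NET’s Encoding.Unicode is UTF-16 LE, which puts a NUL byte next to every ASCII character (the “strange characters” seen in WinMerge). A quick sketch of the difference, using Python here just for illustration:

```python
# Encoding.Unicode in .NET is UTF-16 LE: every ASCII character is
# followed by a NUL byte, which is what WinMerge showed as "strange
# characters". UTF-8 keeps plain ASCII bytes unchanged.
text = "<kml>"
print(text.encode("utf-16-le"))  # b'<\x00k\x00m\x00l\x00>\x00'
print(text.encode("utf-8"))      # b'<kml>'
```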

2016, the Cloud era

Categories: Uncategorized

Regex grouping for automatic coding

Sometimes it is requested to do ugly and repetitive tasks.
For example there were some .aspx pages where in a DataGrid the DataField was written for the monthly columns, but not the HeaderText (I don’t know why…).
Copy & paste would be ok for only 13 columns: but there were a lot of .aspx pages.
So I thought of a substitution using regular expressions.
Typically we have a string delimited by quotation marks, which we can express as




Both are explainable as: the string begins with a ‘ or a ”, there can be a variable number of characters or digits, and then there is a final ‘ (or ”).
The trick is in the parentheses (), which isolate a regex group.
I use an old but valid tool, the Rad Software Regular Expressions Designer (the website is no longer available); the basic syntax is
If we introduce a group
we have surrounded the [“‘] expression with (), and there is a group ‘1’ referring to the character “.
The \1 is a regex reference to the first group (if you delete the parentheses around [“‘] there is an error while evaluating the regex).
By using
we isolate the text between the pair of ” characters.
Using the regex expression $<number> we can refer to these groups, so in Notepad++ I used
and for every DataField, with Replace All, the HeaderText is created with the same month name.
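The same group-and-backreference idea can be sketched outside Notepad++; this is an illustrative Python reconstruction (the pattern and the sample line are my assumptions, not the original ones from the post):

```python
import re

# (["']) captures the opening quote; \1 requires the same quote to
# close the string; ([a-zA-Z]+) captures the month name between them.
line = '<asp:BoundColumn DataField="January">'
pattern = r'DataField=(["\'])([a-zA-Z]+)\1'
# \1 and \2 in the replacement refer to the groups ($1/$2 in Notepad++)
fixed = re.sub(pattern, r'DataField=\1\2\1 HeaderText=\1\2\1', line)
print(fixed)  # <asp:BoundColumn DataField="January" HeaderText="January">
```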

Categories: RegEx, VB.NET

Delete hidden VM Image

I was experimenting with Azure, and I created a Windows Server 200 R2 VM for a customer demo.
Given that the customer is not ready for the cloud… :-)… I deleted the VM in order not to leave useless objects in the portal.
The problem was that I had made a VM image, in order to be able to reset to the initial conditions, and after the VM was deleted in the Azure portal there was still the Storage account:
(screenshot)
Then I tried to delete this storage too, but I received
(screenshot)
Let’s search for this VM image… but apparently this VM image was a ghost: where to find it? In the CONTAINERS tab of the storage account there is nothing, and even exploring the preview portal I found nothing about VM images.
Ok, I’m a beginner with Azure, but some interface should be provided in order to easily discover these things.
After a while I discovered it when creating a new VM from the Gallery:
(screenshot)
in the Gallery there are also your VM images:
(screenshot)
and so you can re-read the VM name.
With the PowerShell command Get-AzureVMImage, filtered by name, there effectively was all the info about it:
(screenshot)
Fortunately Azure PowerShell gives more opportunities than the portal: at this point I deleted the image with Remove-AzureVMImage
(screenshot)
So finally I was able to get rid of the Storage account from the Azure portal:
(screenshot)

Categories: Azure

Autumn colors

In northern Italy

Categories: Uncategorized

xp_cmdshell strange problem

I was working on this request: many containers are unloaded from a ship, and the ones not yet invoiced need to be tracked; in this case an email must be sent with an Excel file as attachment, containing the list of these containers.
The containers are registered in a table of a SQL Server 2008 SP3 10.0.5520.0 (X64).
I tried OPENQUERY but without success (it seems that reading and writing Excel with OPENQUERY is a sort of black magic); so, given also the fact that in any case there were many steps too complex to achieve with TSQL (CreateObject and CLR stored procedures were not acceptable options), I used an SSIS package without a SSIS catalog: dtexec must be used.
I should try with SQL Server 2014, but with SQL 2008 there is this problem, and double-quoted strings are typically required on a dtexec command line: so I used the trick of the old MS-DOS command ECHO to create a batch file and then execute this .bat from xp_cmdshell.
My TSQL is:

DECLARE @varSql nvarchar(500)
DECLARE @varFileName nvarchar(500)
DECLARE @varEcho nvarchar(1000)
DECLARE @XlsFileName nvarchar(500)
DECLARE @intRes int
-- @p_Voyage is a parameter of the containing stored procedure
SET @varFileName = 'D:\SSIS\ContainersWithoutBilling\batches\' + CAST(@p_Voyage AS varchar) + '.bat'
SET @XlsFileName = 'D:\SSIS\ContainersWithoutBilling\Voyages\' + CAST(@p_Voyage AS varchar) + '_NotInvoiced.xlsx'
SET @varSql = '' + CHAR(34) + 'C:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn\dtexec' + CHAR(34) + ' /FILE ' + CHAR(34) + 
				'D:\SSIS\ContainersWithoutBilling\SSISPkg\bin\Package.dtsx' + CHAR(34) + ' /CHECKPOINTING OFF  /REPORTING EW  /SET \Package.Variables[IDVoyage].Value;' + 
				CAST(@p_Voyage AS varchar)
SELECT @varEcho = 'ECHO ' + @varSql + ' > ' + @varFileName
EXEC @intRes = master..xp_cmdshell @varEcho, no_output

At this point we have a batch file; for example, something like this 116243.bat is generated:

"C:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn\dtexec" /FILE "D:\SSIS\ContainersWithoutBilling\SSISPkg\bin\Package.dtsx" /CHECKPOINTING OFF  /REPORTING EW  /SET \Package.Variables[IDVoyage].Value;116243

This example works without problems when launched from File Explorer, and you might think that with

EXEC @intRes = master..xp_cmdshell @varFileName, no_output

your problems are solved; instead, I was surprised that sometimes the .bat file, which worked perfectly when launched by hand, hangs under xp_cmdshell: I was forced to kill both the cmd and the dtexec processes.
Typically xp_cmdshell blocks if user input is requested, but this was not the case.
After some head scratching, I tried with a temporary job:

DECLARE @jobID uniqueidentifier
DECLARE @cmd varchar(1000) 
SET @cmd = 'D:\SSIS\ContainersWithoutBilling\batches\' + CAST(@p_Voyage AS varchar) + '.bat'
EXEC msdb.dbo.sp_add_job @job_name = '_tmp_batch', @enabled  = 1, @start_step_id = 1, @owner_login_name='sa', @job_id = @jobID OUTPUT 
EXEC msdb.dbo.sp_add_jobstep @job_id = @jobID, @step_name = 'launch batch', @step_id = 1, @subsystem = 'CMDEXEC', @command = @cmd
EXEC msdb.dbo.sp_add_jobserver @job_id = @jobID
EXEC msdb.dbo.sp_start_job @job_id = @jobID, @output_flag = 0 
WAITFOR DELAY '00:00:30' -- wait a reasonable 30 seconds, by which time the job should safely be completed
IF EXISTS (SELECT name FROM msdb.dbo.sysjobs WHERE name = '_tmp_batch')
	EXEC msdb.dbo.sp_delete_job @job_name = '_tmp_batch'
-- or you could delete the temp job (if it exists) before creating it, and avoid the
-- WAITFOR if no other instructions depend on the job's work

and this also works for the .bat files that hang under xp_cmdshell.
I think that with an SSIS catalog these problems would not occur, but in this case dtexec was mandatory.
In the end it seems that xp_cmdshell has some bug, and a temporary job is always a better choice.

Categories: SQL Server, SSIS

20 years ago

It was the day of Windows 95 (24 August: do you remember “Start Me Up” by the Rolling Stones?).
After the dark ages of the File Manager we got the Start button, long file names, and so on.
.NET was not even on the horizon, COM was the leading technology, 4 MB was the standard PC memory instead of 4 GB, but Windows 10 still uses some proofs of concept of its great-grandfather.

Categories: Uncategorized

Error creating an Azure Cloud Service with 2.7 SDK

The Azure 2.7 SDK has been available for a few days, and I’m trying it with Visual Studio 2013: when Visual Studio alerted that this update was available, I installed it from inside Visual Studio.
Creating an Azure Cloud Service I got the error “Microsoft Azure Tools: Error: The installed Microsoft Azure Compute Emulator does not support the role binaries. Please install the latest Microsoft Azure Compute Emulator and try again.”.
I don’t know if this is the real solution valid for all cases, but anyway, after re-downloading the Microsoft Azure SDK for .NET – 2.7 and installing only MicrosoftAzureComputeEmulator-x64.exe (x86 if you are using a 32-bit environment, but who has a 32-bit PC today?), the project finally started.

Categories: .NET, Azure, Visual Studio 2013

10 days with 10

On 29 July I tried to update my Windows 8.1 to Windows 10, downloading the ISO from MAPS.

It took from 13:00 to 17:30 on my HP Pavilion: I think the long time is due to Office 2013, Visual Studio 2013 and other installed software.
This is the initial phase:

After 10 days the impressions are:
– less ugly than Windows 8.1: it is less bidimensional/two-colors-only
(screenshot)
– faster, even if it is sometimes difficult to say where the real advantage ends and the placebo effect begins: in any case it seems that threading is used better.
– Edge: very valid, fast; I tried many sites without problems.
– Cortana: on a normal notebook, used with the mouse, it is not of interest.
– I’m still not satisfied with Windows Search: why can’t I have a true details view of the search results, with sortable columns? In the past it was possible… if this possibility exists, it is well hidden.
“This PC” instead of “Computer” was the first problem… but after 10 minutes I was already comfortable with the new Windows.
After the installation the C: drive could be filled to the limit, because a folder is created for the old Windows, but I think that no one will want the old 8.1 back: this article explains how to free up disk space.
The network icon on the taskbar is gone and the network settings are not simple to find at first sight, and I don’t understand the ugly B/W look of “Settings” when the Control Panel looks like its old Windows 7 counterpart.
I’m a VMware Workstation (currently 11.1.2) user, and on the same PC upgraded from 8.1 to 10 the “Unity mode” is more fluid with 10; windows moved to another screen work in a more fluid manner.
Generally speaking you can sense a “work in progress” feeling, but this first initial version is already good.

Categories: Uncategorized
