Tips and Tricks in a world of Mix

When you need a test that runs alongside your process and inserts data, you may want to use a Task for asynchronous insertion.

You can also run it in parallel to your process from the outside.

If you want to be able to stop it, you should implement a CancellationToken and pass it into the process as the stop condition of the while loop.
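In .NET the stop condition is a CancellationToken; the same shape can be sketched in Java, with an atomic flag standing in for the token (the class name and timings here are illustrative, and the per-second DB insert is replaced by a counter):

```java
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicInteger;

public class StoppableInserter {
    // stands in for .NET's CancellationToken
    private final AtomicBoolean cancelled = new AtomicBoolean(false);
    private final AtomicInteger inserted = new AtomicInteger();

    // background insert loop: the flag is the while-loop stop condition
    Thread start() {
        Thread worker = new Thread(() -> {
            while (!cancelled.get()) {
                inserted.incrementAndGet();   // stand-in for the real DB insert
                try { Thread.sleep(10); } catch (InterruptedException e) { return; }
            }
        });
        worker.start();
        return worker;
    }

    void cancel() { cancelled.set(true); }    // like CancellationTokenSource.Cancel()

    public static void main(String[] args) throws InterruptedException {
        StoppableInserter inserter = new StoppableInserter();
        Thread worker = inserter.start();
        Thread.sleep(100);                    // let it run a few iterations
        inserter.cancel();
        worker.join();
        System.out.println("stopped cleanly after " + inserter.inserted.get() + " inserts");
    }
}
```

The key point is that the worker checks the flag on every iteration, so cancellation takes effect at the next loop boundary rather than killing the thread mid-insert.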


using System;
using System.Threading;
using System.Threading.Tasks;
using Oracle.DataAccess.Client;

namespace AsyncTest
{
    public class InitTestPerSecond
    {
        public Task initData;

        public InitTestPerSecond(CancellationToken token)
        {
            // insert a row once per second until the token is cancelled
            initData = Task.Run(() =>
            {
                while (!token.IsCancellationRequested)
                {
                    MyDAL.MyDAL.InsertInTableEachSecond(DateTime.Now);
                    Thread.Sleep(1000);
                }
            });
        }
    }
}

namespace MyDAL
{
    public class MyDAL
    {
        public static void InsertInTableEachSecond(DateTime timestamp)
        {
            try
            {
                using (OracleConnection localConn = new OracleConnection(Utils.GetConnectionString()))
                {
                    localConn.Open();
                    int maxId;
                    using (OracleCommand maxIdCommand = new OracleCommand(@"SELECT MAX(MYID) FROM SCHEMA_NAME.MYTABLE", localConn))
                        maxId = int.Parse(maxIdCommand.ExecuteScalar().ToString());
                    using (OracleCommand initMyData = localConn.CreateCommand())
                    {
                        initMyData.CommandText = string.Format(
                            @"INSERT INTO SCHEMA_NAME.MYTABLE (MYID, COL1, DATE) VALUES ({0}, '{0} test', {1})",
                            ++maxId, Utils.GetDateFormat(timestamp));
                        initMyData.ExecuteNonQuery();
                    }
                }
            }
            catch (Exception)
            {
                throw;  // rethrow with "throw", not "throw ex", to keep the stack trace
            }
        }
    }
}

public class Utils
{
    public static string GetDateFormat(DateTime date)
    {
        // note: .NET format strings are case sensitive - "yyyy", not "YYYY"
        return string.Format(@"TO_DATE('{0}', 'MM/DD/YYYY HH24:MI:SS')", date.ToString("MM/dd/yyyy HH:mm:ss"));
    }

    public static string GetConnectionString()
    {
        return ConnectionStringHandler.ConnectionString;
    }
}

public static class ConnectionStringHandler
{
    public static string ConnectionString = String.Empty;

    static ConnectionStringHandler()
    {
        ConnectionString = ConfigurationManager.ConnectionStrings["MyConn"].ConnectionString;
    }
}
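GetDateFormat builds an Oracle TO_DATE literal by pairing a client-side date pattern with the matching Oracle mask. The same mapping can be sketched in Java (a hypothetical helper mirroring GetDateFormat, not part of the original code):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class OracleDateLiteral {
    // build an Oracle TO_DATE(...) literal; the client pattern "MM/dd/yyyy HH:mm:ss"
    // corresponds to Oracle's 'MM/DD/YYYY HH24:MI:SS' mask
    static String getDateFormat(LocalDateTime date) {
        DateTimeFormatter f = DateTimeFormatter.ofPattern("MM/dd/yyyy HH:mm:ss");
        return "TO_DATE('" + date.format(f) + "', 'MM/DD/YYYY HH24:MI:SS')";
    }

    public static void main(String[] args) {
        System.out.println(getDateFormat(LocalDateTime.of(2015, 3, 14, 9, 26, 53)));
        // prints TO_DATE('03/14/2015 09:26:53', 'MM/DD/YYYY HH24:MI:SS')
    }
}
```

Both sides of the pair must agree: if the client pattern changes, the Oracle mask has to change with it, or the INSERT fails at parse time.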

Useful SQL

Backup data

If NEW_TABLE already exists then …

insert into new_table select * from old_table

If you want to create NEW_TABLE based on the records in OLD_TABLE …

create table new_table as select * from old_table


Remove carriage returns (CHAR(13); in Oracle the function is CHR) from a column:

SELECT replace(YourColumn, CHAR(13), '')
FROM YourTable

update YourTable
set YourColumn = replace(YourColumn, CHAR(13), '')




Potential blacklist:
the special control chars 0-31 in Unicode must be handled.

Whitelist:
[^a-zA-Zא-ת0-9._] - plus punctuation chars of your choice.
This is the start of a regex pattern; [^…] means everything that is not in the pattern is not allowed.
You can also do:
[^\w\s\p{IsHebrew}\p{P}] - this does not catch accent chars like ~ and ^; not finished, but a start.

This is better handled by a StringBuilder with a loop over the chars than by Regex
(more than 10 times faster): just a whitelist of allowed chars and a range check,
like this:

if (c >= 'a' && c <= 'z') ...
else sb.append(' '); // replacement char
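A minimal runnable sketch of that loop in Java (the whitelist here - ASCII letters, digits, '.', '_' and space - is an assumption; extend it with the Hebrew range and punctuation as needed):

```java
public class Sanitizer {
    // whitelist scan: keep allowed chars, replace everything else with ' '
    static String sanitize(String input) {
        StringBuilder sb = new StringBuilder(input.length());
        for (int i = 0; i < input.length(); i++) {
            char c = input.charAt(i);
            boolean allowed = (c >= 'a' && c <= 'z') || (c >= 'A' && c <= 'Z')
                    || (c >= '0' && c <= '9') || c == '.' || c == '_' || c == ' ';
            sb.append(allowed ? c : ' ');   // replacement char
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // the ';', the '--' and the control char 0x07 all fall outside the whitelist
        System.out.println(sanitize("drop;table--users\u0007"));
    }
}
```

A single pass with plain range comparisons avoids the regex engine entirely, which is where the speedup over Regex comes from.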


To use commands in cmd with Elasticsearch you should install curl.

Better put it in c:\ for convenience.

You should download ES from here and install it.

You can check it in Chrome: http://localhost:9200/

Or run it in cmd: c:\>curl -X GET http://localhost:9200/

Run a few commands to automate the process and make it easier to work with:

1) Running as a service on Windows:

c:\elasticsearch-{version}\bin>service install

2) Installing plugins. You'd better get the basics:

bin/plugin --install mobz/elasticsearch-head
bin/plugin --install lukas-vlcek/bigdesk


Installing Marvel will give you the tools that ES itself promotes:

bin/plugin -i elasticsearch/marvel/latest


3) In VS2013 Quick Launch, type nuget and choose Package Manager, then:

Install-Package NEST -Version 1.2.1

This gets you the .NET API for Elasticsearch. The version is optional; without it you get the latest version.

public static void Encrypt()
{
    Configuration objConfiguration = null;

    if (HttpContext.Current != null)        // if we are in a web application of some sort
        objConfiguration = WebConfigurationManager.OpenWebConfiguration("~");
    else                                    // if not web
        objConfiguration = ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.None);

    ConfigurationSection objSection = objConfiguration.GetSection("connectionStrings");

    // if encrypted already, don't do it again
    if (objSection != null && !objSection.SectionInformation.IsProtected)
    {
        objSection.SectionInformation.ProtectSection("DataProtectionConfigurationProvider");
        objSection.SectionInformation.ForceSave = true;
        objConfiguration.Save(ConfigurationSaveMode.Modified);
    }
}





public static void Decrypt()
{
    Configuration objConfiguration = null;

    if (HttpContext.Current != null)        // if we are in a web application of some sort
        objConfiguration = WebConfigurationManager.OpenWebConfiguration("~");
    else                                    // if not web
        objConfiguration = ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.None);

    ConfigurationSection objSection = objConfiguration.GetSection("connectionStrings");

    // if encrypted then decrypt
    if (objSection != null && objSection.SectionInformation.IsProtected)
    {
        objSection.SectionInformation.UnprotectSection();
        // objConfiguration.Save(); - you can save the change, but it will reset all the global
        // data of your application in IIS (relevant to comment it out for web applications;
        // it can stay on in others)
    }
}



So the mission was to set a static member for the connectionString, so that we'd read the string from web.config just once. Along the way we added Encrypt and Decrypt for the connectionString.

The trouble was that we couldn't be sure the static field survives the request-response model of WebApi.

The easy solution was to reuse the existing code to initialize the HttpContext.Current.Application["connectionString"] value, if it is null, in the Application_Start function in global.asax.

When it is initialized in that function on the host/server side of the distributed system, HttpContext.Current.Application["connectionString"] is set for the first time and stays set until an IISReset.

(If you want something constant for a single request, the way to go is to put it in the Init function in global.asax.)

The trouble was that the Application variable kept resetting itself on each request.

The cause turned out to be the decryption that saved web.config: saving it resets the site values held by IIS, effectively recycling the site data and resetting the Application variable values as well.

In the end, encryption is executed on publish to the server; Decrypt decrypts the secured section but does not save it back, so web.config doesn't actually change while the host is running, and the Application data stays intact.

So, after the last post about Elasticsearch, which explains a bit of the technology's terms, I'm getting to real-life problems.

After entering the data into Elasticsearch in the last post, I now have to delete it all! Oh my, how did that happen! What shall I do now?

Well, if you are only starting out it's not that bad: just delete the index, which removes the data and the existing mapping as well.

curl -XDELETE "http://localhost:9200/test"


One of the requirements was to make the data searchable by just a few chars, not only by the whole word.

So.. ?

Well, actually that means the default indexing that occurred while creating the index is not good enough; we should have defined the index settings manually, stating from the beginning what kind of analysis should be performed on the index.


The nGram filter allows us to break the data we enter into small tokens which we can search later. So if you have the word Jerusalem and define nGram min = 7, max = 20, you'll get "[Jerusal, Jerusale, Jerusalem]" indexed.

Of course it is more logical to start with an nGram min of two chars and go on from there.
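What the filter produces can be sketched with a tiny generator (an illustration only, not ES code; note that the "[Jerusal, Jerusale, Jerusalem]" example matches edge/prefix n-grams, while the full nGram filter also emits mid-word substrings):

```java
import java.util.ArrayList;
import java.util.List;

public class NGrams {
    // edge n-grams of one word, from minGram to maxGram chars:
    // the prefixes an autocomplete-style index would store for the term
    static List<String> edgeNGrams(String word, int minGram, int maxGram) {
        List<String> grams = new ArrayList<>();
        for (int len = minGram; len <= Math.min(maxGram, word.length()); len++) {
            grams.add(word.substring(0, len));
        }
        return grams;
    }

    public static void main(String[] args) {
        System.out.println(edgeNGrams("Jerusalem", 7, 20));
        // prints [Jerusal, Jerusale, Jerusalem]
    }
}
```

With min = 2 the same call would also emit Je, Jer, Jeru and so on, which is what makes two-letter autocomplete possible.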

I tried to put the index settings and the mapping into one file, but it failed with the error

Analyzer [your_analyzer_name] not found for field [_all]

When I split them, it worked.

To the last post I added the manual index definition, CreateIndex.js:


{
    "settings": {
        "index": {
            "number_of_shards": 1,
            "analysis": {
                "filter": {
                    "your_name_for_nGram_filter": {
                        "type": "nGram",
                        "min_gram": 2,
                        "max_gram": 20,
                        "token_chars": ["letter", "digit", "punctuation", "symbol"]
                    }
                },
                "analyzer": {
                    "your_name_for_index_analyzer": {
                        "type": "custom",
                        "tokenizer": "whitespace",
                        "filter": ["lowercase", "asciifolding", "your_name_for_nGram_filter"]
                    },
                    "your_name_for_search_analyzer": {
                        "type": "custom",
                        "tokenizer": "whitespace",
                        "filter": ["lowercase", "asciifolding"]
                    }
                }
            }
        }
    }
}

Then you run the curl to create it:

curl -XPUT "http://localhost:9200/test" -d @c:\pathto\CreateIndex.js


Now we have the index settings right, with autocomplete suggestions starting from 2 letters.


Now we re-enter the mapping and the data from the last post, just adding some features to the mapping.

CreateMappings.js:




{
    "name_of_your_object": {
        "_all": {
            "search_analyzer": "your_name_for_search_analyzer",
            "index_analyzer": "your_name_for_index_analyzer"
        },
        "properties": {
            "always_in_query_field": {
                "type": "string"
            }
        }
    }
}

Then you run the curl:

curl -XPUT "http://localhost:9200/test/name_of_your_object/_mapping" -d @C:\pathto\createMappings.js



Now we'll enter the actual data as in the last post:

curl -XPOST "http://localhost:9200/test/name_of_your_object/_bulk" --data-binary @c:\pathto\formatizedToIndex.json


Now you have data with analyzers inside Elasticsearch, with autocomplete. Happy searching!
