Python

Python is an interpreted, high-level programming language for general-purpose programming. It can be downloaded as part of the Anaconda distribution, which comes with the Spyder IDE and Jupyter Notebook. Python is also very useful for data science thanks to its rich libraries for (a short example follows the list):

  • Data Manipulation (NumPy, SciPy, Pandas)
  • Machine Learning (Scikit-Learn)
  • Visualization (matplotlib, Seaborn, Plotly)
  • Deep Learning (Keras, TensorFlow).
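As a quick taste of those libraries working together, here is a minimal sketch with made-up toy data: NumPy builds the arrays, Pandas holds the table, scikit-learn fits a line, and matplotlib plots it.

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression

# Toy data: NumPy for the arrays, Pandas for the tabular view.
x = np.linspace(0, 10, 50)
y = 3 * x + 2 + np.random.normal(scale=2, size=x.size)
df = pd.DataFrame({"x": x, "y": y})

# Scikit-Learn for a simple model.
model = LinearRegression()
model.fit(df[["x"]], df["y"])
print("slope:", model.coef_[0], "intercept:", model.intercept_)

# Matplotlib for visualization.
plt.scatter(df["x"], df["y"], label="data")
plt.plot(df["x"], model.predict(df[["x"]]), color="red", label="fit")
plt.legend()
plt.show()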

Useful Resources
1) Python Data Science Handbook, GitHub
2) Python for Data Science For Dummies
3) Practical Machine Learning with Python
4) Mastering Machine Learning with Python in Six Steps
5) Mastering Machine Learning with scikit-learn
6) Mastering Pandas

Cheat Sheets
1) Python Basics
2) Pandas Basics
3) Pandas Advanced
4) NumPy Basics
5) SciPy – Linear Algebra
6) Importing Data
7) Matplotlib
8) Seaborn
9) Bokeh
10) Data Wrangling
11) Jupyter Notebook
12) Scikit-Learn

Useful Tips
1) Python for Data Analysis
2) NumPy Tutorial Part 1: Introduction to Arrays
3) NumPy Tutorial Part 2: Vital functions for Data Analysis
4) Select Pandas Dataframes Columns and Rows using loc and iloc (tips 4-6 are sketched briefly after this list)
5) Merging Pandas Dataframes
6) Summarising, Aggregating, and Grouping data in Python Pandas
7) How to rewrite SQL Queries in Pandas
8) Principal Component Analysis (PCA) using Python
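As a pocket reminder for tips 4-6, here is a minimal sketch with invented toy data (the column names and values are made up for illustration) showing loc/iloc selection, merging, and groupby aggregation in Pandas.

import pandas as pd

# Toy data, invented for illustration.
sales = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "customer": ["ann", "bob", "ann", "cid"],
    "amount": [100, 250, 75, 310],
})
customers = pd.DataFrame({
    "customer": ["ann", "bob", "cid"],
    "country": ["SG", "KR", "SG"],
})

# Tip 4: loc selects by label/condition, iloc by integer position.
print(sales.loc[sales["amount"] > 100, ["customer", "amount"]])
print(sales.iloc[0:2, 1:3])

# Tip 5: merging DataFrames (a SQL-style join on a key column).
merged = sales.merge(customers, on="customer", how="left")

# Tip 6: summarising / aggregating / grouping.
print(merged.groupby("country")["amount"].agg(["count", "sum", "mean"]))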

Backup Database Script

USE master;
GO

DECLARE @databaseName VARCHAR(256),
		@path VARCHAR(256),
		@date VARCHAR(50),
		@disk VARCHAR(256);

SET @databaseName = 'Singapore';
SET @path = 'C:\Office\Database\backup\';

-- Build a file name like Singapore_yyyymmdd.bak (style 112 = yyyymmdd).
SELECT @date = CONVERT(VARCHAR(20), GETDATE(), 112);
SELECT @disk = @path + @databaseName + '_' + @date + '.bak';

BACKUP DATABASE @databaseName TO DISK = @disk;

PS – Huge Log File

When a log file gets huge, it is not a good idea to open it in a text editor; most editors will stop responding or crash because the file is too big. PowerShell can read just the part you need:

Get-Content "{file}" | Select-Object -First 10

Get-Content "{file}" | Select-String -Pattern "{Pattern}"

PS – Register service

$destination = "D:\Publish\Snatch"
$filename = "Lottery.Automation.Services.Snatch.exe"
$servicename = "Snatch.Keno.Korea"
$description = "Snatch keno for korea market"


New-Service -Name $servicename -DisplayName $servicename -Description $description -BinaryPathName "$destination\$filename" -StartupType Automatic


# To delete the service later, uncomment the line below:
#(Get-WmiObject win32_service -filter "name='$servicename'").delete()

VSO Prebuild

When you add a vNext project in VSO, the build will always fail with an error because the build agent does not have dnvm installed.

Run this prebuild script before running the other tests.

# bootstrap DNVM into this session.
&{$Branch='dev';iex ((new-object net.webclient).DownloadString('https://raw.githubusercontent.com/aspnet/Home/dev/dnvminstall.ps1'))}

# load up the global.json so we can find the DNX version
$globalJson = Get-Content -Path $PSScriptRoot\global.json -Raw -ErrorAction Ignore | ConvertFrom-Json -ErrorAction Ignore

if($globalJson)
{
    $dnxVersion = $globalJson.sdk.version
}
else
{
    Write-Warning "Unable to locate global.json to determine the DNX version; using 'latest'"
    $dnxVersion = "latest"
}

# install DNX
# only installs the default (x86, clr) runtime of the framework.
# If you need additional architectures or runtimes you should add additional calls
# ex: & $env:USERPROFILE\.dnx\bin\dnvm install $dnxVersion -r coreclr
& $env:USERPROFILE\.dnx\bin\dnvm install $dnxVersion -Persistent

# run DNU restore on all project.json files in the src folder, including 2>&1 to redirect stderr to stdout for badly behaved tools
Get-ChildItem -Path $PSScriptRoot\src -Filter project.json -Recurse | ForEach-Object { & dnu restore $_.FullName 2>&1 }

Hack DDoS protection by Cloudflare

Some websites try to protect themselves with Cloudflare's DDoS protection. Take this URL as an example: “http://www.jlotto.kr/keno.aspx?method=kenoWinNoList”.

1. Cloudflare first checks the browser information, then redirects the page to jlotto.kr. (Screenshots: DDoSProtectionCloudFlare, DDoSBreakCloudflare)

2. After seeing this, turn on Fiddler to capture what happens: you will see three requests with HTTP status codes 503, 302 and 200. (Screenshot: DDoSBreakFiddler) Then investigate in more detail what information is passed along from 503 -> 302 -> 200.

3. Looking at the last request (the 200 success), you can identify that it requires the “cf_clearance” cookie to be set.

4. Tracing further back, the second request is “http://www.jlotto.kr/cdn-cgi/l/chk_jschl?jschl_vc={0}&pass={1}&jschl_answer={2}”. The values jschl_vc, pass and jschl_answer come from the first request: jschl_vc and pass can be read from the inner HTML, while jschl_answer has to be calculated with JavaScript.

After we apply this flow in the code, we are able to crawl the official site.
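A minimal sketch of that flow in Python with the requests library, assuming the old-style Cloudflare JavaScript challenge described above; the regexes, the 5-second wait and the solve_challenge helper are assumptions for illustration, not the exact code used against the site.

import re
import time
import requests

BASE = "http://www.jlotto.kr"
TARGET = BASE + "/keno.aspx?method=kenoWinNoList"
HEADERS = {"User-Agent": "Mozilla/5.0"}  # Cloudflare checks the browser info


def solve_challenge(js_snippet, domain):
    """Hypothetical helper: evaluate the challenge JavaScript (for example
    with an embedded JS engine) and add len(domain), as the old challenge
    required."""
    raise NotImplementedError("evaluate the JS challenge here")


session = requests.Session()

# Steps 1-2: the first request returns 503 with the JavaScript challenge page.
resp = session.get(TARGET, headers=HEADERS)
html = resp.text

# Step 4: jschl_vc and pass can be read straight from the inner HTML.
jschl_vc = re.search(r'name="jschl_vc" value="([^"]+)"', html).group(1)
pass_value = re.search(r'name="pass" value="([^"]+)"', html).group(1)

# jschl_answer has to be calculated from the JavaScript embedded in the page.
js = re.search(r'setTimeout\(function\(\)\{(.+?)\}, 4000\);', html, re.S).group(1)
jschl_answer = solve_challenge(js, "www.jlotto.kr")

# Cloudflare expects a short delay before the answer is submitted.
time.sleep(5)
session.get(
    BASE + "/cdn-cgi/l/chk_jschl",
    params={"jschl_vc": jschl_vc, "pass": pass_value, "jschl_answer": jschl_answer},
    headers=HEADERS,
)

# Step 3: the session now holds the cf_clearance cookie (set on the 302),
# so the target page finally returns 200.
print(session.get(TARGET, headers=HEADERS).status_code)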