Script: Check ZingMe login using Mechanize

First, download the mechanize library for Python:

mechanize-0.2.5.tar.gz

After the download finishes, install it:

user@CPU10577:~/Desktop$ tar -zxvf mechanize-0.2.5.tar.gz
user@CPU10577:~/Desktop$ cd mechanize-0.2.5/
user@CPU10577:~/Desktop/mechanize-0.2.5$ sudo python setup.py install

ZingMe login check script:

#! /usr/bin/python
import mechanize
import sys
username = "username"
password = "password"
login_url = "http://me.zing.vn"  # assumed ZingMe login URL; replace with the real one
browser = mechanize.Browser()
cookies = mechanize.LWPCookieJar()
browser.set_cookiejar(cookies)
# Browser options
browser.set_handle_equiv(True)
browser.set_handle_referer(True)
browser.set_handle_redirect(True)
browser.set_handle_robots(False)
browser.set_handle_refresh(mechanize._http.HTTPRefreshProcessor(), max_time=1)
browser.addheaders = [('User-agent', 'Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2.3) Gecko/20100423 Ubuntu/10.04 (lucid) Firefox/3.6.3')]
try:
    browser.open(login_url)
    browser.select_form("frmLogin")
    browser.form["u"] = username
    browser.form["p"] = password
    response = browser.submit()
    if "LOGIN_SUCCESSFULLY" in str(cookies):
        print "Login Successful!!!!"
    else:
        print "Login Failed!!!"
except Exception,ex:
    print "Login Failed:",ex
    sys.exit(2)
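The success check above relies on a cookie named LOGIN_SUCCESSFULLY turning up in the jar, detected by substring-matching the jar's string form. A minimal standard-library sketch (Python 3, with a hand-built cookie standing in for what the server would set) shows the same idea done more precisely, by inspecting cookie names:

```python
from http.cookiejar import Cookie, CookieJar

def make_cookie(name, value, domain):
    # Build a minimal Cookie object by hand (normally the browser fills this in
    # from the server's Set-Cookie header).
    return Cookie(
        version=0, name=name, value=value, port=None, port_specified=False,
        domain=domain, domain_specified=True, domain_initial_dot=False,
        path="/", path_specified=True, secure=False, expires=None,
        discard=True, comment=None, comment_url=None, rest={},
    )

def login_succeeded(jar, marker="LOGIN_SUCCESSFULLY"):
    # Check cookie names explicitly instead of substring-matching str(jar),
    # which could also match a cookie *value* by accident.
    return any(c.name == marker for c in jar)

jar = CookieJar()
jar.set_cookie(make_cookie("LOGIN_SUCCESSFULLY", "1", "example.com"))
print(login_succeeded(jar))  # True
```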
Script: Get Memcached stats using telnetlib
#! /usr/bin/python
import telnetlib
host="10.30.56.102"
port="11211"
tn=telnetlib.Telnet(host,port)
print "CONNECTING.........."
tn.read_very_eager()  
tn.write('stats' + '\n')
tn.write('quit\n')
ret = tn.read_until('END')
tn.close()
print ret

Result:

CONNECTING..........
STAT pid 7987
STAT uptime 9851595
STAT time 1417681131
STAT version 1.4.4
STAT pointer_size 64
STAT rusage_user 271.242764
STAT rusage_system 211.859792
STAT curr_connections 299
STAT total_connections 70171
STAT connection_structures 300
STAT cmd_get 30720
STAT cmd_set 11231
STAT cmd_flush 27
STAT get_hits 22536
STAT get_misses 8184
STAT delete_misses 70
STAT delete_hits 92
STAT incr_misses 0
STAT incr_hits 0
STAT decr_misses 0
STAT decr_hits 0
STAT cas_misses 0
STAT cas_hits 163
STAT cas_badval 0
STAT auth_cmds 0
STAT auth_errors 0
STAT bytes_read 27839713
STAT bytes_written 2303266501
STAT limit_maxbytes 1073741824
STAT accepting_conns 1
STAT listen_disabled_num 0
STAT threads 4
STAT conn_yields 0
STAT bytes 7411835
STAT curr_items 151
STAT total_items 11231
STAT evictions 0
END

Where:

Field Sample value Description
accepting_conns 1 Whether the server is currently accepting new connections (1 = yes).
auth_cmds 0 Number of authentication commands processed by the server, if you use authentication in your installation. The default is IP (routing) level access control, which speeds up Memcached by removing the authentication requirement.
auth_errors 0 Number of failed client authentication attempts.
bytes 7411835 Number of bytes currently used for caching items; this server currently uses ~7 MB of its maximum allowed size of 1 GB (limit_maxbytes).
bytes_read 27839713 Total number of bytes received from the network by this server.
bytes_written 2303266501 Total number of bytes sent to the network by this server.
cas_badval 0 The "cas" command is Memcached's way of avoiding locking; "cas" calls with a bad identifier are counted under this key.
cas_hits 163 Number of successful "cas" commands.
cas_misses 0 "cas" calls fail if the value has been changed since it was requested from the cache.
cmd_flush 27 The "flush_all" command clears the whole cache and shouldn't be used during normal operation.
cmd_get 30720 Number of "get" commands received since server startup, regardless of whether they were successful.
cmd_set 11231 Number of "set" commands serviced since startup.
connection_structures 300 Number of internal connection handles currently held by the server; a rough "maximum parallel connection count". Here, 300 structures for 299 current connections (curr_connections) is reasonable; servers typically hold slightly more connection_structures than curr_connections.
conn_yields 0 Memcached has a configurable maximum number of requests per event ("-R" command line argument); this counter shows how many times any client hit that limit.
curr_connections 299 Number of open connections to this Memcached server; comparable to the row count of MySQL's "SHOW PROCESSLIST".
curr_items 151 Number of items currently in this server's cache.
decr_hits 0 The "decr" command decreases a stored (integer) value; a "hit" is a "decr" call to an existing key.
decr_misses 0 "decr" calls to undefined keys.
delete_hits 92 Stored keys may be deleted using the "delete" command; a "hit" is a delete of an existing key. This system uses delete to clear short-lived race keys once fresh content has been cached.
delete_misses 70 Number of "delete" commands for keys that do not exist in the cache; here, deletions of already-gone race keys (see delete_hits).
evictions 0 Number of objects removed from the cache to free memory for new items because Memcached reached its maximum memory setting (limit_maxbytes).
get_hits 22536 Number of successful "get" commands (cache hits) since startup; divide by cmd_get to get the cache hit rate. This server served ~73% of its get requests from the cache (22536 / 30720).
get_misses 8184 Number of failed "get" requests, because nothing was cached for the key or the cached value had expired.
incr_hits 0 Number of successful "incr" commands. "incr" adds to a stored (integer) value and fails if no value is stored. This installation doesn't currently use incr/decr, so all four values are zero.
incr_misses 0 Number of failed "incr" commands (see incr_hits).
limit_maxbytes 1073741824 Maximum configured cache size (set on the command line when starting the memcached server); see "bytes" for actual usage.
listen_disabled_num 0 Number of denied connection attempts because memcached reached its configured connection limit ("-c" command line argument).
pid 7987 Current process ID of the Memcached process.
pointer_size 64 Pointer width of the host system; may show "32" instead of "64" if the running Memcached binary was compiled for 32-bit environments on a 64-bit system.
reclaimed 14740 Number of times a write to the cache reused memory from an expired key (not shown in the dump above). These are not evictions caused by a full cache.
rusage_system 211.859792 System CPU time, in seconds, consumed by this server process.
rusage_user 271.242764 User CPU time, in seconds, consumed by this server process.
threads 4 Number of threads used by the current Memcached server process.
time 1417681131 Current Unix timestamp on the Memcached server.
total_connections 70171 Number of successful connection attempts to this server since it started; roughly connections_per_task × webserver_tasks × webserver_restarts.
total_items 11231 Number of items ever stored on this server; not a current item count but a counter increased by every new item stored in the cache.
uptime 9851595 Number of seconds the Memcached server has been running since the last restart: 9851595 / (60 * 60 * 24) ≈ 114 days.
version 1.4.4 Version number of the server.
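A dump like the one above is just `STAT <name> <value>` lines terminated by `END`, so it is easy to parse into a dictionary and compute derived values such as the hit rate. A standalone sketch, using a few lines from the output above:

```python
def parse_stats(raw):
    """Parse 'STAT name value' lines from a memcached stats dump into a dict."""
    stats = {}
    for line in raw.splitlines():
        parts = line.split()
        if len(parts) == 3 and parts[0] == "STAT":
            stats[parts[1]] = parts[2]
    return stats

raw = """STAT cmd_get 30720
STAT get_hits 22536
STAT get_misses 8184
END"""

stats = parse_stats(raw)
hit_rate = int(stats["get_hits"]) / float(stats["cmd_get"])
print("hit rate: %.1f%%" % (hit_rate * 100))  # hit rate: 73.4%
```

In the telnet script, the string returned by `tn.read_until('END')` can be fed straight into `parse_stats`.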
Script: Automatically generate data for Kafka
from kafka import SimpleProducer, KafkaClient
import random
import time
######## Initial ########
host = "192.168.1.100"
port = 9092
topic = "logstash"
kafka_server = "%s:%d"% (host,port)
browsers = {'1': 'Firefox', '2': 'Chrome', '3': 'Safari', '4': 'Opera', '5': 'IE'}
behavior = {'1': 'Listen', '2': 'Download', '3': 'Like', '4': 'Share', '5': 'Post'}
count = 0
######## Connect to Server ########
kafka = KafkaClient(kafka_server)
producer = SimpleProducer(kafka)
######## Generate data
#for i in range (0,20):
while True:
  rd = str(random.randint(1,5))
  bs = browsers[rd]
  rd = str(random.randint(1,5))
  bh = behavior[rd]
  ip = ".".join(map(str, (random.randint(0, 255) for _ in range(4))))
  date = time.strftime("%d/%m/%Y-%H:%M:%S")
  rate = random.randint(900,1000)
  message = '%s %s %s %s'% (date,ip,bh,bs)
  producer.send_messages(topic, message)
  count = count + 1
  print 'Message ',count, ' send'
  time.sleep(rate/1000.0)  # rate is in milliseconds; float division avoids sleeping 0 s under Python 2
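The random log line built inside the loop (timestamp, IP, behavior, browser) can be factored into a small helper so the format is testable on its own, independent of the Kafka connection:

```python
import random
import time

BROWSERS = ['Firefox', 'Chrome', 'Safari', 'Opera', 'IE']
BEHAVIORS = ['Listen', 'Download', 'Like', 'Share', 'Post']

def gen_message(now=None):
    """Build one 'date ip behavior browser' line like the Kafka loop above."""
    date = time.strftime("%d/%m/%Y-%H:%M:%S", now or time.localtime())
    ip = ".".join(str(random.randint(0, 255)) for _ in range(4))
    return "%s %s %s %s" % (date, ip, random.choice(BEHAVIORS), random.choice(BROWSERS))

msg = gen_message()
date, ip, behavior, browser = msg.split()
print(msg)
```

The producer loop then reduces to `producer.send_messages(topic, gen_message())` plus the sleep.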

Multi-threaded version:

from kafka import SimpleProducer, KafkaClient
import random
import time
import threading
######## Initial ########
host = "192.168.1.100"
port = 9092
topic = "logstash"
kafka_server = "%s:%d"% (host,port)
browsers = {'1': 'Firefox', '2': 'Chrome', '3': 'Safari', '4': 'Opera', '5': 'IE'}
behavior = {'1': 'Listen', '2': 'Download', '3': 'Like', '4': 'Share', '5': 'Post'}
numworker = 10
#####
def kafka_worker():
  kafka = KafkaClient(kafka_server)
  producer = SimpleProducer(kafka)
  ######## Generate data
  while True:
    rd = str(random.randint(1,5))
    bs = browsers[rd]
    rd = str(random.randint(1,5))
    bh = behavior[rd]
    ip = ".".join(map(str, (random.randint(0, 255) for _ in range(4))))
    date = time.strftime("%d/%m/%Y-%H:%M:%S")
    message = '%s %s %s %s'% (date,ip,bh,bs)
    producer.send_messages(topic, message)
threads = []
for i in range(numworker):
    t = threading.Thread(target=kafka_worker)
    threads.append(t)
    t.start()
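The worker threads above loop forever and are never joined, so the process can only be stopped by killing it. A `threading.Event` gives each worker a clean exit path; a sketch with a counting worker standing in for the Kafka producer:

```python
import threading
import time

stop = threading.Event()
counts = [0] * 3  # one slot per worker; each slot has a single writer

def worker(idx):
    # Stand-in for kafka_worker: loop until the main thread signals shutdown.
    while not stop.is_set():
        counts[idx] += 1

threads = [threading.Thread(target=worker, args=(i,)) for i in range(3)]
for t in threads:
    t.start()

time.sleep(0.1)   # let the workers run briefly
stop.set()        # signal shutdown
for t in threads:
    t.join()      # wait for every worker to exit

print(all(c > 0 for c in counts))  # True
```

In the Kafka version, the `while True:` in `kafka_worker` would become `while not stop.is_set():`, and the main script would call `stop.set()` and join the threads on exit.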
Script: Automatically generate data for Scribe
from scribe import scribe
from thrift.transport import TTransport, TSocket
from thrift.protocol import TBinaryProtocol
import random
import time
 
def autogenscribe():
  socket = TSocket.TSocket(host="10.30.40.15", port=1463)
  transport = TTransport.TFramedTransport(socket)
  protocol = TBinaryProtocol.TBinaryProtocol(trans=transport, strictRead=False, strictWrite=False)
  client = scribe.Client(protocol)
  transport.open()
#### Init
  category = 'logstash'
  browsers = {'1': 'Firefox', '2': 'Chrome', '3': 'Safari', '4': 'Opera', '5': 'IE'}
  behavior = {'1': 'Listen', '2': 'Download', '3': 'Like', '4': 'Share', '5': 'Post'}
  i = 0
  while True:
    rd = str(random.randint(1,5))
    bs = browsers[rd]
    rd = str(random.randint(1,5))
    bh = behavior[rd]
    ip = ".".join(map(str, (random.randint(0, 255) for _ in range(4))))
    date = time.strftime("%d/%m/%Y-%H:%M:%S")
    message = '%s %s %s %s'% (date,ip,bh,bs)
    log_entry = scribe.LogEntry(category, message)
    result = client.Log(messages=[log_entry])
    if result == 0:  # 0 is scribe.ResultCode.OK
      i = i + 1
      print i, ': success'
autogenscribe()
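Each loop iteration above sends a single-entry list to `client.Log`; since `Log` accepts a list of entries, messages can be buffered and flushed in batches to cut round trips. A sketch of the buffering logic, with a hypothetical `send` callable standing in for `lambda msgs: client.Log(messages=msgs)`:

```python
class BatchLogger(object):
    """Buffer log entries and flush them in batches via a send callable."""
    def __init__(self, send, batch_size=10):
        self.send = send          # e.g. lambda msgs: client.Log(messages=msgs)
        self.batch_size = batch_size
        self.buffer = []

    def log(self, entry):
        self.buffer.append(entry)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        # Send whatever is buffered, then start a fresh buffer.
        if self.buffer:
            self.send(self.buffer)
            self.buffer = []

sent_batches = []
logger = BatchLogger(sent_batches.append, batch_size=3)
for i in range(7):
    logger.log("entry-%d" % i)
logger.flush()  # flush the remainder
print([len(b) for b in sent_batches])  # [3, 3, 1]
```

Remember to call `flush()` before closing the transport so the tail of the buffer is not lost.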

Multi-threaded version:

from scribe import scribe
from thrift.transport import TTransport, TSocket
from thrift.protocol import TBinaryProtocol
import threading
import random
import time
 
numworker = 10
def autogenscribe():
  socket = TSocket.TSocket(host="10.30.40.15", port=1463)
  transport = TTransport.TFramedTransport(socket)
  protocol = TBinaryProtocol.TBinaryProtocol(trans=transport, strictRead=False, strictWrite=False)
  client = scribe.Client(protocol)
  transport.open()
#### Init
  category = 'logstash'
  browsers = {'1': 'Firefox', '2': 'Chrome', '3': 'Safari', '4': 'Opera', '5': 'IE'}
  behavior = {'1': 'Listen', '2': 'Download', '3': 'Like', '4': 'Share', '5': 'Post'}
  while True:
    rd = str(random.randint(1,5))
    bs = browsers[rd]
    rd = str(random.randint(1,5))
    bh = behavior[rd]
    ip = ".".join(map(str, (random.randint(0, 255) for _ in range(4))))
    date = time.strftime("%d/%m/%Y-%H:%M:%S")
    message = '%s %s %s %s'% (date,ip,bh,bs)
    log_entry = scribe.LogEntry(category, message)
    result = client.Log(messages=[log_entry])
threads = []
for i in range(numworker):
    t = threading.Thread(target=autogenscribe)
    threads.append(t)
    t.start()
