Node.js - the core
readObject('demo.json', function (err, obj) { if (err) { console.error(err); } else { console.log(obj); } });
fs = require 'fs'
readObject = (filePath, cb) -> fs.readFile filePath, (err, buf) … err, obj
readObject 'demo.json', (err, obj) -> if err then console.error err else console.log obj
(JavaScript / CoffeeScript)
{ "name": "Mark Volkmann", "address": { "street": "644 Glen …
whilst, until, waterfall, queue, auto, iterator, apply, nextTick; utilities: memoize, unmemoize, log, dir, noConflict. Written by Caolan McMahon, https://github.com/caolan/async (Node.js Async Example)
124 pages | 7.87 MB | 1 year ago
FISCO BCOS 1.3 Documentation (Chinese)
… /mydata/node0/genesis.json --config /mydata/node0/config.json
Verify consensus: watch the log for block-sealing messages: tail -f /mydata/node0/log/* | grep +++
After a short wait, the following line should appear periodically, indicating the nodes are running consensus rounds correctly:
INFO|2018-08-10 14:53:33:083|+++++++++++++++++++++++++++
Verify connectivity: check the log: cat /mydata/node1/log/* | grep "Connected"
The following line indicates the node has connected to other peers:
INFO|2018-11-07 15:21:11:314|Connected to 1 peers
Verify consensus: watch the block-sealing messages: tail -f /mydata/node1/log/* | grep +++
The following line should appear periodically: … files.
node0
|-- genesis.json  # genesis block file (genesis block info, god account, genesis nodes)
|-- config.json   # main node configuration file (IP, ports, consensus algorithm)
|-- log.conf      # node logging configuration file (log format, priority)
|-- start.sh      # node start script
|-- stop.sh       # node stop script
|-- data
|   |--
491 pages | 5.72 MB | 1 year ago
Scrapy 0.16 Documentation
… our spider. You can see a log line for each URL defined in start_urls. Because these URLs are the starting ones, they have no referrers, which is shown at the end of the log line, where it says (referer: …
… org/Computers/Programming/Languages/Python/Books/ This is what the shell looks like: [ ... Scrapy log here ... ] [s] Available Scrapy objects: [s] … 2010-08-19 21:45:59-0300 [default] INFO: Spider closed …
… each depth level. Usage example: $ scrapy parse http://www.example.com/ -c parse_item [ ... scrapy log lines crawling example.com spider ... ] >>> STATUS DEPTH LEVEL 1 <<< # Scraped Items ------------
203 pages | 931.99 KB | 1 year ago
Spring Boot 2.2.8.RELEASE Reference Guide
Table of contents excerpt: 4.4.1. Log Format; 4.4.4. Log Levels; 4.4.5. Log Groups; 4.4.6. Custom Log Configuration
523 pages | 11.05 MB | 1 year ago
Spring Boot 2.2.7.RELEASE Reference Guide
Table of contents excerpt: 4.4.1. Log Format; 4.4.4. Log Levels; 4.4.5. Log Groups; 4.4.6. Custom Log Configuration
522 pages | 11.01 MB | 1 year ago
Spring Boot 2.2.5.RELEASE Reference Documentation
Table of contents excerpt: 4.4.1. Log Format; 4.4.4. Log Levels; 4.4.5. Log Groups; 4.4.6. Custom Log Configuration
521 pages | 11.00 MB | 1 year ago
Spring Boot 2.2.4.RELEASE Reference Guide
Table of contents excerpt: 4.4.1. Log Format; 4.4.4. Log Levels; 4.4.5. Log Groups; 4.4.6. Custom Log Configuration
521 pages | 11.00 MB | 1 year ago
Spring Boot 2.2.0.RC1 Reference Documentation
Table of contents excerpt: 4.4.1. Log Format; 4.4.4. Log Levels; 4.4.5. Log Groups; 4.4.6. Custom Log Configuration
518 pages | 10.93 MB | 1 year ago
Scrapy 0.18 Documentation
… our spider. You can see a log line for each URL defined in start_urls. Because these URLs are the starting ones, they have no referrers, which is shown at the end of the log line, where it says (referer: …
… org/Computers/Programming/Languages/Python/Books/ This is what the shell looks like: [ ... Scrapy log here ... ] [s] Available Scrapy objects: [s] … 2010-08-19 21:45:59-0300 [default] INFO: Spider closed …
… each depth level. Usage example: $ scrapy parse http://www.example.com/ -c parse_item [ ... scrapy log lines crawling example.com spider ... ] >>> STATUS DEPTH LEVEL 1 <<< # Scraped Items ------------
201 pages | 929.55 KB | 1 year ago
Scrapy 0.16 Documentation
… our spider. You can see a log line for each URL defined in start_urls. Because these URLs are the starting ones, they have no referrers, which is shown at the end of the log line, where it says (referer: …
… org/Computers/Programming/Languages/Python/Books/ This is what the shell looks like: [ ... Scrapy log here ... ] [s] Available Scrapy objects: [s] … 2010-08-19 21:45:59-0300 [default] INFO: Spider closed …
… each depth level. Usage example: $ scrapy parse http://www.example.com/ -c parse_item [ ... scrapy log lines crawling example.com spider ... ] >>> STATUS DEPTH LEVEL 1 <<< # Scraped Items -----------
272 pages | 522.10 KB | 1 year ago
1000 results in total