---
permalink: /toolkit/case-studies/wave-energy-prize/
layout: toolkit
title: Case Study - Wave Energy Prize
---
<!--// OPEN #page-wrap //-->
<div id="page-wrap">
<div class="inner-page-wrap has-no-sidebar portfolio-type-standard row clearfix">
<!-- OPEN article -->
<article
class="portfolio-article col-sm-12 clearfix post-8443 portfolio type-portfolio status-publish has-post-thumbnail hentry portfolio-category-technology portfolio-category-2-7 portfolio-category-2-9 portfolio-category-3-1 portfolio-category-3-2 portfolio-category-4-2"
id="8443" itemscope="" itemtype="http://schema.org/CreativeWork">
<div class="portfolio-item-content">
<div class="container port-detail-media-container"><!-- OPEN .container -->
<figure class="media-wrap col-sm-12">
</figure>
</div><!-- CLOSE .container -->
<div class="grid-container padding-bottom-5">
<section class="article-body-wrap col-sm-9">
<section class="portfolio-detail-description">
<div class="body-text clearfix" itemprop="description">
<section class="container">
<div class="row">
<div class="spb_content_element col-sm-12 spb_text_column">
<div class="spb_wrapper clearfix">
<h1>Wave Energy Prize</h1>
<h3 style="border-bottom: 1px solid #e4e4e4;" class="spb-heading spb-text-heading"><span>Department of Energy (DOE)</span>
</h3>
<h2>Summary</h2>
<p><u>Description</u></p>
<p>The Wave Energy Prize is an 18-month design-build-test prize
competition that aims to:</p>
<ul>
<li>spur game-changing performance enhancements to wave
energy converters (WECs);
</li>
<li>provide a pathway to sweeping cost reductions;</li>
<li>mobilize new and existing talent;</li>
<li>provide an opportunity for apples-to-apples WEC testing
and evaluation;
</li>
<li>increase the visibility and attract potential investors;
and
</li>
<li>successfully enable the top performers to become viable
and competitive industry members.
</li>
</ul>
<p>The Wave Energy Prize will double the state-of-the-art
performance of WECs, specifically the energy captured per
unit structural cost of these devices.</p>
<p></p>
<div id="attachment_8450" style="max-width: 440px"
class="wp-caption alignleft"><a
href="{{ site.baseurl }}/assets/images/toolkit/case-studies/Wave-energy-pool.jpg"><img
class="wp-image-8450 size-full"
src="{{ site.baseurl }}/assets/images/toolkit/case-studies/Wave-energy-pool.jpg"
alt="A large wave pool called the maneuvering and seakeeping basin is depicted."
width="430" height="243"></a>
<p class="wp-caption-text">The Maneuvering and Seakeeping
Basin at the Naval Surface Warfare Center Carderock
Division in Carderock, Md., where the final round of
testing of 1/20th scale wave energy converters was held.
(Photo courtesy of NSWC Carderock Division)</p></div>
<p>With more than 50 percent of the U.S. population living
within 50 miles of coastlines, there is vast potential to
provide clean, renewable electricity to communities and
cities in the United States using wave energy. It is
estimated that the technically recoverable wave energy
resource is approximately 900-1,200 terawatt hours (TWh) per
year. Developing just a small fraction of the available wave
energy resource could allow for millions of American homes
to be powered with this clean, reliable form of energy. For
context, approximately 90,000 homes could be powered by 1
TWh per year. Extracting just 5 percent of the technical
resource potential could result in wave energy powering 5
million American homes.</p>
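<p>The resource figures above can be sanity-checked with a quick back-of-envelope calculation. The sketch below uses only the numbers quoted in this section (roughly 90,000 homes per TWh per year and a 900–1,200 TWh/year technically recoverable resource); the midpoint resource value is an illustrative assumption, not a DOE figure.</p>

```python
# Back-of-envelope check of the wave energy resource figures quoted above.
# Assumptions taken from the text: ~90,000 homes powered per TWh/year,
# and a technically recoverable resource of 900-1,200 TWh/year.
HOMES_PER_TWH = 90_000


def homes_powered(resource_twh, fraction_extracted):
    """Homes powered when a given fraction of the resource is extracted."""
    return resource_twh * fraction_extracted * HOMES_PER_TWH


# Extracting 5 percent of a ~1,100 TWh/year resource (midpoint of the
# quoted 900-1,200 TWh/year range) lands near the "5 million homes" claim:
print(f"{homes_powered(1_100, 0.05):,.0f} homes")
```

<p>At 5 percent extraction the result falls between roughly 4 and 5.4 million homes across the quoted resource range, consistent with the "5 million American homes" estimate.</p>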
<p><u>Engagement</u></p>
<p>The Wave Energy Prize has successfully mobilized both new and
existing talent through the challenge, with engineers,
developers and builders from across the country having
thrown their hats in the ring. Participants include people
who represent universities, small companies, more
established players in wave energy and independent
collaborations. Also, through the <a
href="http://waveenergyprize.org/marketplace"><em>Marketplace</em></a>,
the competition has provided an online forum for external
interested parties to collaborate with participating teams.
</p>
<p><u>Detailed prize structure</u></p>
<p>The Wave Energy Prize is divided into three phases (design,
build, and test) separated by four technology gates as shown
and described below:</p>
                            <div class="alignright" style="padding-bottom: 10px;">
                                <a href="{{ site.baseurl }}/assets/images/toolkit/case-studies/Wave-energy-prize-funnel.jpg"><img
                                        class="size-medium wp-image-8448 alignright"
                                        src="{{ site.baseurl }}/assets/images/toolkit/case-studies/Wave-energy-prize-funnel-300x146.jpg"
                                        alt="A chart depicts the structure of the Wave Energy Prize, detailing the timeline from concept and design to test and award."
                                        sizes="(max-width: 300px) 100vw, 300px" width="300"
                                        height="146"></a></div>
                            <p><strong><em>Design</em></strong>: For the first part of the
                                design phase, participants were required to submit detailed
                                technical submissions describing
their WEC concepts. The judging panel evaluated these
submissions according to the Technology Performance Level
rubric developed by the National Renewable Energy Laboratory
and selected up to 20 qualified teams. These teams were then
tasked with building 1/50th scale prototypes of their WEC
concepts, numerically modeling their performance and
developing detailed build plans for 1/20th scale prototypes.
Qualified teams were required to test their 1/50th scale
prototypes in 31 different sea states at one of five
small-scale testing facilities (University of Iowa,
University of Maine, University of Michigan, Stevens
Institute of Technology and Oregon State University) across
the country. The judging panel then evaluated device
performance, numerical modeling results and build plans to
                                select nine finalist and two alternate teams.</p>
<p><strong><em>Build</em></strong>: Finalist and alternate teams
were tasked with building 1/20th scale prototypes of their
WEC concepts. Finalists were given up to $125,000 and
alternates $25,000 to build these prototypes, with
alternates becoming eligible for up to $125,000 were they to
become finalists. Each team was paired with a data analyst
from either the National Renewable Energy Laboratory or
Sandia National Laboratories. Finalists and alternates
worked with engineers at the Naval Surface Warfare Center
Carderock Division and their assigned data analysts to come
up with comprehensive test and evaluation plans to ensure a
successful testing campaign at the Carderock Maneuvering and
Seakeeping (MASK) Basin. The judging panel then selected
nine finalist teams to proceed to testing at the MASK Basin
starting August 2016.</p>
<p><strong><em>Test</em></strong>: Each finalist team has been
given one week on site at Carderock to prepare for testing
and then one week of testing time in the MASK Basin. The
                                test is to determine whether their 1/20th scale devices
                                double the state-of-the-art performance of WECs and are thus
eligible to win the grand prize. If a team becomes eligible
to win the grand prize, their WEC device's performance will
be further evaluated to account for other important energy
capture, reliability and survivability metrics using proxy
measurements collected during MASK Basin testing. The
testing program will provide the Department of Energy (DOE)
and investors with apples-to-apples comparisons of WEC
device techno-economic performance when operating in real
ocean conditions.</p>
<p><u>Administrators and partners</u></p>
<p>DOE's Water Power Program, along with a contracted prize
administration team comprised of Ricardo, Inc., JZ
Consulting, and Polaris Strategic Communications; technical
experts from Sandia National Laboratories and the National
Renewable Energy Laboratory; and staff at the Naval Surface
Warfare Center Carderock Division are responsible for
implementing the prize design, build, and test phases.</p>
<p>DOE also has partnered with various branches of the
Department of the Navy to successfully execute the
challenge. The Office of Naval Research has provided funds
to develop the technologies and capabilities required to
ensure fair and rigorous testing in the MASK Basin; the
Naval Surface Warfare Center has provided in-kind support
for the Judging Panel and reduced facility costs; and the
Assistant Secretary of the Navy for Energy, Installations,
and Environment has provided support to test three of the
finalists in the MASK Basin.</p>
<p>The prize team also created a two-person independent expert
review panel of prize and challenge experts—one from the
White House Office of Science and Technology Policy and one
formerly of the Defense Advanced Research Projects
Agency—that, during go/no-go meetings for the challenge,
provided guidance to the prize team on all aspects of the
project, including testing program logistics, communications
and outreach, and event planning.</p>
<p><u>Incentives</u></p>
<p>The Wave Energy Prize has provided a thoughtful package of
incentives to attract developers to compete and allow them
the opportunity to reach their full potential and meet the
goal of doubling the state of the art performance:</p>
<ul>
<li>A total of $2.25 million is reserved for the prize purse
($1.5 million grand prize, $500,000 for second place and
$250,000 for third place)
</li>
<li>Seed funding for up to 10 finalists ($125,000 each) and
alternates ($25,000 each)
</li>
<li>Small-scale testing to all qualified teams valued at
close to $45,000 per team
</li>
<li>Testing for all finalists at the MASK Basin at Carderock
valued at $180,000-$200,000 per test
</li>
<li>The creation of a team-building platform (the <em>Marketplace</em>)
located on the competition website, where teams can
solicit expert needs, or experts can seek out teams
</li>
<li>An open source numerical software package, <a
href="https://wec-sim.github.io/WEC-Sim/">Wave
Energy Converter Simulation</a> (WEC-Sim) developed and
supported by the DOE national labs, and supporting
software provided for free by MathWorks for use by all
qualified teams choosing to use it for the duration of
the competition
</li>
<li>A team summit with opportunities to engage with
technical experts, investors, media, and government
</li>
</ul>
<h2>Results</h2>
<p>The challenge is not yet complete, but there have been
numerous successes so far in attracting new and existing
players to wave energy; having teams successfully reach
aggressive technical milestones; bringing forward
innovations across a range of WEC device types; generating
significant publicity; and building technical capacity to
test WECs at testing facilities across the country.</p>
<p>With an aggressive communications and outreach strategy, 92
teams registered for the competition, three times more than
expected. Of these, 66 turned in technical submissions,
which were evaluated by a panel of expert judges to identify
20 qualified teams. Most teams that registered were not
previously known to DOE. Seventeen of the 20 qualified teams
completed the initial small-scale testing phase, and out of
the nine finalists and two alternates, only two have
received any funding from DOE in the past.</p>
<p>Most of the teams have met the aggressive timelines for the
challenge. For example, to meet the requirements for
Technology Gate 2, the qualified teams built 1/50th scale
model devices, tested them at university facilities around
the country and conducted significant numerical modeling
studies in just four months. As of June 2016, and as can be
seen on the <a
href="http://waveenergyprize.org/teams/updates">Team
Updates webpage</a>, finalists and alternates have made
significant progress in designing, building, and testing
their 1/20th scale devices. This puts the challenge in a
great position to achieve its remaining objectives.</p>
<p>The finalists and alternates have put forward diverse WEC
designs, which include two submerged areal absorbers, four
point absorbers, two attenuators and three terminators. And
in these designs, DOE is already seeing technical
innovations in the areas of geometry, materials, power
conversion and controls. Some of these include:</p>
<ul>
<li>adaptive sea-state-to-sea-state control,</li>
<li>wave-to-wave control,</li>
<li>power absorption in multiple degrees of freedom,</li>
<li>optimized float shapes and dimensions for energy
absorption for broad bandwidth of wave frequencies,
</li>
<li>survival strategies such as submerging beneath the
surface for extreme storms,
</li>
<li>use of structures and materials that are cost effective
to manufacture, and
</li>
<li>flexible membranes that react to the wave pressure over
a broad area.
</li>
</ul>
<p>The public's awareness of wave energy is increasing because
of the teams' efforts in the challenge. In just over a year,
more than 100 news stories have featured the competition,
including channels like <em>Popular Science</em>, <em>The
Weather Channel</em> and <em>National Geographic</em>.
The website has hosted more than 23,000 visitors, and its
social media channels have logged more than a half million
impressions. This increased awareness of the potential
contribution of wave energy to the nation's renewable energy
mix will exist long after the challenge ends and will likely
set the stage for future private-sector investments and
government funding opportunities.</p>
<p>In the process of executing the testing program, the
small-scale testing facilities (University of Iowa,
University of Maine, University of Michigan, Stevens
Institute of Technology and Oregon State University) and the
Naval Surface Warfare Center Carderock Division have gained
significant technical capabilities and capacities to
rigorously test WECs.</p>
<h2>Areas of Excellence</h2>
<p><strong>Area of Excellence #1: "Identify Goal and Outcome
Metrics"</strong></p>
<p>There needs to be a step change in the levelized cost of
energy (LCOE) of wave energy to put it on a path to
commercialization as an electricity generation source in
large-scale utility markets in the next 15 to 20 years. It
is the goal of the challenge to achieve that step change: a
doubling of the state-of-the-art performance of WECs, as
measured through their energy absorbed per unit structural
cost.</p>
<p></p>
<div id="attachment_8447" style="max-width: 310px"
class="wp-caption alignleft"><a
href="{{ site.baseurl }}/assets/images/toolkit/case-studies/Wave-energy-tour.jpg"><img
class="size-medium wp-image-8447"
src="{{ site.baseurl }}/assets/images/toolkit/case-studies/Wave-energy-tour-300x200.jpg"
alt="Finalists and alternates from the Wave Energy Prize stand on a platform overlooking a wave pool."
sizes="(max-width: 300px) 100vw, 300px" width="300"
height="200"></a>
<p class="wp-caption-text">Finalists and alternates on a tour
                        of the MASK Basin. (Credit: NSWC Carderock Division)</p>
</div>
<p>LCOE is the commonly accepted metric that points to the
commercial viability of a source of energy. LCOE
allows for comparisons of the costs of electricity produced
by different means and sources such as solar, wind, fossil
and so on. But LCOE is a complicated metric, and its value
reflects many different variables, from details of capital
expenditures to various factors influencing operational
expenditures. Due to limited experience in building
and operating WECs, estimates of these cost details have a
high degree of uncertainty, and for low Technology Readiness
Levels (TRLs), LCOE may not be an appropriate metric to
judge device or system potential. In addition, it's hard to
test for a bunch of variables in a prize competition given
limited testing resources and time constraints. The prize
team sought to design a simple metric that could best
measure the relevant parameters of an early maturity yet
techno-economically viable system when operating in real
ocean conditions. Thus, the team created a simplified proxy
metric for LCOE that would target key components of LCOE to
show WEC device potential.</p>
<p>Given that wave energy converters (WECs) are early maturity
technologies, there was no standardized way of testing
diverse WECs to evaluate their performance before the
competition. Thus, the key metric of the challenge had to
satisfy the following requirements:</p>
<ul>
<li>To be able to objectively quantify the potential of high
techno-economic performance to bring about a step change
improvement over the state of the art
</li>
<li>To be able to quantify the state of the art with respect
to this fundamental performance metric
</li>
<li>To fully embrace the fair and accountable assessment of
fundamentally new WEC concepts (potentially leading to
disruptive technology innovation), innovations not
foreseeable at the onset of the competition
</li>
</ul>
<p>Based on these requirements, the technical experts at the
National Renewable Energy Laboratory and Sandia National
Laboratories worked with DOE and the prize administration
team to develop a new metric, ACE (Average Climate Capture
                    Width per Characteristic Capital Expenditure), to measure
                    the state-of-the-art performance of WECs. ACE
represents the energy captured per unit structural cost of
WECs. This is a proxy metric for LCOE. Just like LCOE is a
cost-to-benefit metric ($/kWh), ACE is a benefit-to-cost
metric that focuses on a key component that drives LCOE for
WECs, namely structural cost. The denominator of
ACE is a measure of the structural cost of the device,
evaluated based on technical drawings, materials used and
analytical load estimation on the structure device.</p>
<p>The state-of-the-art value for ACE is 1.5 meters per million
dollars (1.5m/$M). A finalist becomes eligible to win the
$1.5 million grand prize if they double ACE to 3m/$M during
the final round of testing at the MASK Basin in
Carderock.</p>
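<p>As a rough illustration of how the ACE threshold works, here is a small sketch. Only the 1.5m/$M baseline and the 3m/$M doubling target come from the prize rules; the device numbers are invented for illustration:</p>

```python
# ACE is a benefit-to-cost proxy for LCOE: capture width (meters) per
# characteristic capital expenditure (millions of dollars). The device
# values below are hypothetical; only the thresholds come from the rules.
BASELINE_ACE = 1.5              # m/$M, state of the art
TARGET_ACE = 2 * BASELINE_ACE   # 3.0 m/$M, grand-prize eligibility threshold

def ace(capture_width_m, capex_millions):
    return capture_width_m / capex_millions

def eligible_for_grand_prize(capture_width_m, capex_millions):
    return ace(capture_width_m, capex_millions) >= TARGET_ACE

print(eligible_for_grand_prize(4.2, 1.2))  # 3.5 m/$M -> True
print(eligible_for_grand_prize(2.4, 1.2))  # 2.0 m/$M -> False
```

<p>Teams that clear the threshold are then ranked by HPQ, so ACE acts as an eligibility gate rather than the final ranking.</p>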
<p>The prize team still believed that ACE, while a significant
step in the right direction for evaluating the performance
of WECs, did not provide full confidence in WECs that could
meet the true challenge of performing at reasonable cost in
the open ocean. Thus, the prize team decided to evaluate
teams eligible to win the grand prize according to a metric
called Hydrodynamic Performance Quality (HPQ), which accounts
for other important energy capture, reliability and
survivability metrics using proxy measurements collected
during MASK Basin testing. The teams that surpass the 3m/$M
threshold will be ranked by their HPQ, and the team with the
highest HPQ will win the $1.5 million grand prize.</p>
<p> </p>
<p><strong>Area of Excellence #2: "Obtain Agency
Clearance"</strong></p>
<p>The DOE prize team in the Water Power Program engaged with
senior leadership from the very inception of the idea to run
a prize competition. Office leadership was briefed during
the development of the challenge, including on topics such
as goals, rules, testing program, judging process and
communications and outreach plan. General Counsel (GC)
informed the development of the rules, as well as the terms
and conditions.</p>
<p>The DOE prize team also worked with GC from the very
beginning of drafting the funding opportunity announcement
for a prize administration team. GC helped the team prepare
the Federal Register Notice in time for the announcement of
the launch of the prize competition.</p>
<p>Given that the prize administration team was selected under a
financial assistance agreement, the DOE contracting officer
has been engaged in each important phase, especially in the
go/no-go decisions where the team performed a rigorous
evaluation of the prize continuation application after the
registration period closed and after the end of the 1/50th
scale testing program. The DOE National Environmental Policy
Act (NEPA) staff, also part of the prize team, have ensured
that all actions taken by participants during the
competition meet the requirements of NEPA.</p>
<p> </p>
<p><strong>Area of Excellence #3: "Execute the
Communications Plan"</strong></p>
<p>The Wave Energy Prize has used several approaches to
successfully publicize the prize, mobilize potential
participants and create a strong following for the
competition, including the following:</p>
<ul>
<li>Emailing previous applicants to and inquirers of Water
Power Program funding opportunities
</li>
<li>Creating a website that allows for streamlined updates,
as well as a back end that participants can use to
engage with the prize administration team and DOE
</li>
<li>Creating the <em>Marketplace</em>, a skill-sharing
platform that promoted team-building, on the website
</li>
<li>Creating and disseminating a monthly newsletter that
answered frequently asked questions and presented
technical details in simple language, along with
communicating what DOE is striving to achieve
</li>
<li>Creating a strong social media presence that has created
a large following for the competition
</li>
<li>Training participants on communications, outreach, and
media training
</li>
<li>Disseminating publicly important data generated from the
competition
</li>
</ul>
<p>Further, teams have been required per the rules to <a
href="http://waveenergyprize.org/teams/update">communicate
publicly</a> on the website about their progress and to
speak about the challenge and their participation in their
own words. Participants put a lot of blood, sweat and tears
into the competition, and it is important to shed light on
their stories and why they are participating.</p>
<p>Below are details on how the prize team has worked to ensure
a successfully executed communications plan.</p>
<p><strong>Strong relationships with different communications
teams in DOE</strong>: The Wave Energy Prize has leveraged
the fact that different communications teams within DOE have
different audiences and outlets, and the prize team has
tailored content to the audiences reached by different
communications teams. Both the DOE prize team and the prize
administration team have worked to understand how different
communications teams in DOE from the Office of Energy
Efficiency and Renewable Energy and DOE Public Affairs work
and what their protocols are.</p>
<p><strong>Synergistic communications plans and
cross-promotion</strong>: The Wave Energy Prize is working
off a communications and outreach plan in which DOE prize
team communications and prize administration team
communications are synergistic, and there is a clear
delineation of the kinds of communications that come from
each of the two teams. All communications that the DOE prize
team and the prize administration team put out are planned
one month in advance to ensure momentum. Media coverage,
monthly newsletters and blogs and team features come from
the prize administration team; synthesis and reflection
pieces are written by the DOE prize team; and press releases
at key stages of the competition are published by both
teams. The prize administration team promotes all
communications relating to the prize coming from any DOE
office.</p>
<p><strong>Writing for multiple audiences and simple
communication</strong>: The Wave Energy Prize aims to
mobilize new and existing talent, increase visibility and
attract potential investors. To achieve these objectives, it
has been important to communicate goals, requirements,
processes and progress in the simplest language possible,
while being careful not to dilute the technical details the
prize team worked hard to solidify. The prize team has been
able to create content for those with varying degrees of
interest and expertise, as described below.</p>
<p>The goal of the Wave Energy Prize is to double the
state-of-the-art performance of wave energy converters as
measured through a metric called ACE, short for average
climate capture width per characteristic capital
expenditure—clearly a mouthful, and entirely jargon. ACE is
a measure of the effectiveness of a WEC at absorbing power
from the incident wave energy field divided by a measure of
the capital expenditure in commercial production of the load
bearing device structure. So how can that be said
simply?</p>
<p>Here is how this metric is communicated to the public: "ACE
is determined by dividing, in essence, the wave energy
extraction efficiency of a wave energy converter by its
structural cost." This language is simple enough that it makes
the key metric of the Wave Energy Prize understandable to
many more people than just experts in wave energy. And the
prize team has been directing those interested in more of
the details to detailed blog posts with more technical
depth.</p>
<p><strong>Tracking impact and creating an archive: </strong>The
prize administration team has tracked all key statistics
pointing to the impact of communications efforts, including
visitor numbers; click-through rates; time spent on
webpages; geographic location; browser type; social media
impressions on Facebook, LinkedIn and Twitter; and so on.
The prize administration team provides monthly reports to
the DOE prize team on these statistics, and changes to
communications plans are made accordingly.</p>
<p>Also, since the Wave Energy Prize is unfolding over 18 months,
the DOE prize team and prize administration team archive all
outreach and media content on the <a
href="http://waveenergyprize.org/newsroom"><em>Newsroom</em></a>
page of the competition website, creating a narrative arc
for those interested.</p>
<p>The DOE Wave Energy Prize won GSA Challenge.gov's Five Years
of Excellence in Federal Challenge & Prize Competition
Award for Best Challenge Engagement Strategy, and its
communications plan was a key part of its success. The Wave
Energy Prize has tracked analytics on its various social
media platforms and its website, and developed a two-track
communications plan, one executed by the contractor selected
to administer the prize, and one executed by the
program-level agency team.</p>
<p><strong>Area of Excellence #4: "Accept
Solutions"</strong></p>
<p><strong>Registration and eligibility</strong>: The Wave
Energy Prize registration period started April 27, 2015, and
the end date was extended from June 15, 2015, to June 30,
2015, to maximize the number of teams entering the prize
funnel. During this extended window, registration jumped
from 50 teams on June 15 to 92 on June 30.</p>
<p>The terms and conditions specified the eligibility and
participation requirements per the America COMPETES
Reauthorization Act. These requirements spanned
indemnification and liability, insurance, and citizenship or
private company incorporation, among other things. When
participants registered, they were required to provide
documentation to attest to their eligibility, allowing the
prize administration team to only put the submissions
eligible to win through the time-intensive judging
process.</p>
<p><strong>Receiving and managing solutions</strong>: Those
participants making it through the Wave Energy Prize funnel
will have submitted electronic documentation (including
writing and drawings), codes and numerical modeling output
and device prototypes over the course of the competition.
All along, participants have been provided clear guidance on
how and when to submit their solutions, and the prize team
has ensured that any and all solutions provided by teams are
dealt with per the terms and conditions. The prize
administration team has come up with detailed plans for who
on the prize team handles what submission and when. Below
are additional steps and policies put in place to ensure
successful submissions from participants:</p>
<ul>
<li>Participants submit all electronic documents on the Wave
Energy Prize website back end—custom-built for the prize
based on open source software—where the judging panel is
able to access them to complete their evaluations.
</li>
<li>The prize administration team worked with the
small-scale testing facilities to ensure the qualified
teams' 1/50th scale devices were properly received.
</li>
<li>For the 1/20th scale prototypes, the prize
administration team worked with a third party shipping
company to pick up and transfer devices from the
contiguous United States to the MASK Basin in Carderock,
Maryland.
</li>
<li>During the entire competition, participants have been
able to ask questions of the judges and technical
experts—through the prize administration team—to make
sure that any actions taken by the participants were
allowed per the rules and testing requirements.
</li>
</ul>
<p><strong>Nondisclosure agreements and intellectual
property</strong>: The entire prize team (consisting of the
DOE, prize administration team, judges, Carderock, the
National Renewable Energy Laboratory and Sandia National
Laboratories) have signed or are bound by nondisclosure
agreements to ensure that all intellectual property of the
participants remains theirs. Further, all data and testing
documentation generated during the competition will be made
public on the DOE Marine and Hydrokinetic Data Repository in
November 2017, one year after winners are announced. The
documentation will be scrubbed of any intellectual property,
such as the detailed design drawings of solutions the
participants brought as their original concepts into the
competition. All details have been specified in the terms
and conditions.</p>
<p><strong>Area of Excellence #5: "Pay Winners"</strong></p>
<p>Since the Wave Energy Prize is being executed under a
financial assistance agreement with the prize administration
team, it is the prize administration team (contractors) that
is responsible for disbursing both seed funding to the
finalists and alternates and the prize purse. These funds
were allocated to the prize administration team after
go/no-go decisions on the continuation of the Wave Energy
Prize program. The prize purse will be disbursed to the team
leader(s) of the winning team(s).</p>
<h2>Challenge Type</h2>
<p><strong>Technology</strong></p>
<p>Given the early maturity level of the wave energy industry,
the Wave Energy Prize has been structured and rules written
to maximize the diversity of devices that can be tested and
evaluated, and to allow for flexibility in dealing with
WECs. Thousands of hours have been put in by the technical
experts (National Renewable Energy Laboratory, Sandia
National Laboratories, Carderock, University of Michigan,
University of Maine, University of Iowa, Stevens Institute
of Technology, Oregon State University and judges) over the
course of the competition to ensure the proper planning and
execution of the 1/50th and 1/20th scale testing programs.
This has instilled confidence in the participants of the
rigor of the competition and demonstrated to industry the
level of effort required to get apples-to-apples comparisons
of WEC performance.</p>
<h2>Legal Authority</h2>
<p>America COMPETES Reauthorization Act</p>
<h2>Challenge Website</h2>
<p><a href="http://waveenergyprize.org/newsroom">www.waveenergyprize.org</a>
</p>
</div>
</div>
</div>
</section>
</div>
</section>
</section>
</div>
</div>
<!-- CLOSE article -->
</article>
</div>
<!--// WordPress Hook //-->
<!--// CLOSE #page-wrap //-->
| 46.409567 | 265 | 0.798828 | eng_Latn | 0.998815 |
3bcb204da9482c9e1a2bb82c316e638b3c97b2ea | 118 | md | Markdown | README.md | publicdomain/consolidate-directory | 8739f6da6b5e23b7654ee13fdeeea24d5d4ba162 | [
"CC0-1.0"
] | null | null | null | README.md | publicdomain/consolidate-directory | 8739f6da6b5e23b7654ee13fdeeea24d5d4ba162 | [
"CC0-1.0"
] | null | null | null | README.md | publicdomain/consolidate-directory | 8739f6da6b5e23b7654ee13fdeeea24d5d4ba162 | [
"CC0-1.0"
] | null | null | null | # consolidate-directory
Consolidates all files in a directory (including subdirectories). Can be filtered by pattern.
| 39.333333 | 93 | 0.822034 | eng_Latn | 0.998899 |
3bcb3cb364bb7a43ebf1f18b40e61e03ce1dc72b | 7,836 | md | Markdown | _posts/computer-science/hardware/2021-11-09-cpu.md | KelipuTe/KelipuTe.github.io | 36eac8eabd7dde8f1bcc521dd6ec0afd3c488de1 | [
"MIT"
] | 1 | 2017-09-21T13:56:05.000Z | 2017-09-21T13:56:05.000Z | _posts/computer-science/hardware/2021-11-09-cpu.md | KelipuTe/KelipuTe.github.io | 36eac8eabd7dde8f1bcc521dd6ec0afd3c488de1 | [
"MIT"
] | null | null | null | _posts/computer-science/hardware/2021-11-09-cpu.md | KelipuTe/KelipuTe.github.io | 36eac8eabd7dde8f1bcc521dd6ec0afd3c488de1 | [
"MIT"
] | null | null | null | ---
title: "CPU(中央处理器)"
create_date: 2021-11-09 08:00:00 +0800
date: 2021-11-11 08:00:00 +0800
tags: computer-science hardware
comment: false
show_author_profile: true
show_subscribe: false
mathjax: true
---
- CPU (central processing unit)
  - arithmetic unit
    - ALU (arithmetic logic unit)
      - arithmetic unit
        - half adder
        - full adder
        - 8-bit ripple carry adder
        - carry look-ahead adder
      - logic unit
    - AC (accumulator register)
    - DR (data buffer register)
    - PSW (program status word register)
  - control unit
    - IR (instruction register)
    - PC (program counter)
    - AR (address register)
    - ID (instruction decoder)
  - register
  - CPU cache
    - locality
      - temporal locality
      - spatial locality
    - cache line, cache block
    - L1 cache
      - I-cache (instruction cache)
      - D-cache (data cache)
    - L2 cache
    - L3 cache
    - cache hit
    - cache miss
    - cache hit rate
    - cache coherence
      - write through
      - write back
    - multi-core cache coherence
      - write propagation
        - bus snooping
      - transaction serialization
      - MESI protocol
- program
  - CPU execution time
    - number of machine cycles
      - number of instructions
      - CPI (cycles per instruction)
    - clock cycle
- instruction
  - instruction cycle
    - IF (instruction fetch)
    - ID (instruction decode)
    - EX (execute)
  - machine cycle
- clock
  - clock rate
  - clock cycle
  - overclocking
  - underclocking
  - dynamic frequency scaling
- instruction pipeline
  - dependencies
    - dynamic reordering
    - out-of-order execution
  - branch prediction
    - speculative execution
- superscalar processors
- multi-core processors
- multiple independent CPUs
### CPU
### CPU

The CPU is the core of a computer and is responsible for executing programs.

The arithmetic unit and the control unit make up the core of the CPU. Other parts include the registers, the CPU cache, the buses, and so on.

The power of the CPU lies in its being programmable: write different instructions and it performs different tasks.

### Arithmetic unit

The arithmetic unit carries out all arithmetic and logic operations and performs logic tests such as AND, OR, and NOT.

The arithmetic unit is made up of the ALU (arithmetic logic unit), AC (accumulator register), DR (data buffer register), PSW (program status word register), and so on.

- ALU: performs arithmetic and logic operations on data.
- AC: holds operation results or source operands.
- DR: temporarily holds instructions or data from memory.
- PSW: holds the condition codes produced by executing instructions, such as the overflow flag.

#### ALU

The ALU consists of an arithmetic unit and a logic unit.

The 8 operations an ALU generally supports:

- add
- add with carry
- subtract
- subtract with borrow
- negate
- increment (+1)
- decrement (-1)
- pass through (the number passes through unchanged)

A simple ALU has no multiply or divide; multiplication is just repeated addition. Advanced ALUs have a dedicated arithmetic unit for multiplication.

The ALU takes three inputs: two operands and an operation code. The operation code tells the ALU which operation to perform.

The ALU has multiple outputs: one data output and several flag outputs.

The ALU has many flag outputs; OVERFLOW, ZERO (the zero-test circuit), and NEGATIVE are in general use.

- The adder's carry-out is wired to the overflow flag; if the overflow flag outputs true, an overflow occurred.
- When computing A - B, if the zero-test circuit outputs true, A and B are equal.
- When computing A - B, if the negative flag outputs true, A is less than B.
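The flag behavior can be sketched in a few lines of Python. This is a behavioral model of an 8-bit ALU's add/subtract path, not a gate-level one; the flag names follow the text above:

```python
def alu_add(a, b, bits=8):
    # Compute a + b on unsigned `bits`-wide values.
    # OVERFLOW corresponds to the adder's carry-out.
    full = a + b
    mask = (1 << bits) - 1
    return full & mask, {"OVERFLOW": full > mask}

def alu_sub(a, b, bits=8):
    # Compute a - b and report the ZERO / NEGATIVE flags described above.
    mask = (1 << bits) - 1
    result = (a - b) & mask
    return result, {"ZERO": a == b, "NEGATIVE": a < b}

result, flags = alu_add(200, 100)   # 300 does not fit in 8 bits
print(result, flags)                # 44 {'OVERFLOW': True}
_, flags = alu_sub(5, 5)
print(flags["ZERO"])                # True
```

Comparing A and B this way (subtract, then read ZERO/NEGATIVE) is exactly how a CPU implements "equal" and "less than" tests.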
#### Arithmetic unit

The arithmetic unit is responsible for arithmetic on data.

- half adder: implements addition up to 1 + 1.
- full adder: handles sums beyond 1 + 1.
- 8-bit ripple carry adder: adds two 8-bit binary numbers.
- carry look-ahead adder: the adder used in modern computers; a different circuit that is faster.

#### Logic unit

The logic unit is responsible for comparisons and tests, and is built from a large number of logic gates.

### Control unit

The control unit controls the operation of the entire CPU.

The control unit is made up of the IR (instruction register), PC (program counter), AR (address register), ID (instruction decoder), and so on.

- IR: temporarily holds the instruction the CPU is executing.
- PC: holds the address of the instruction to execute.
- AR: holds the memory address the CPU is currently accessing.
- ID: analyzes the instruction's operation code.
### CPU cache

The CPU cache: a block of memory placed inside the CPU to stage data.

Based on the principle of locality, when the cache fetches data from RAM it fetches one cache line at a time rather than a single datum.

On Linux, /sys/devices/system/cpu/cpu0/cache/index0/coherency_line_size shows the size of an L1 cache line.

#### L1 cache

The L1 cache is split into an instruction cache and a data cache.

Each CPU core has its own L1.

On Linux:

- /sys/devices/system/cpu/cpu0/cache/index0/size shows the size of the L1 data cache
- /sys/devices/system/cpu/cpu0/cache/index1/size shows the size of the L1 instruction cache

#### L2 cache

Each CPU core has its own L2.

On Linux, /sys/devices/system/cpu/cpu0/cache/index2/size shows the size of the L2 cache.

#### L3 cache

The L3 cache is usually shared by multiple CPU cores.

On Linux, /sys/devices/system/cpu/cpu0/cache/index3/size shows the size of the L3 cache.

#### Cache hit rate

- Data cache hit rate: access data sequentially as much as possible.
- Instruction cache hit rate: access data in an orderly way as much as possible (so that branch prediction is more accurate).
- Multi-core cache hit rate: pin the threads of compute-intensive programs to a specific CPU core. On a multi-core CPU a thread may bounce between cores, which is bad for the per-core L1 and L2 caches but does not affect the shared L3.

On Linux, sched_setaffinity is provided to bind a thread to a specific CPU core.
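The data-cache point — sequential traversal exploits spatial locality, strided traversal defeats it — can be demonstrated with a small, self-contained sketch. Note that in CPython the interpreter overhead dominates, so the gap is far smaller than it would be in C; the sizes here are illustrative:

```python
import time

def traverse_rows(matrix):
    # Row-major order: consecutive elements are adjacent in memory,
    # so each fetched cache line is fully used (spatial locality).
    total = 0
    for row in matrix:
        for value in row:
            total += value
    return total

def traverse_cols(matrix):
    # Column-major order over a row-major layout: each access jumps
    # a whole row ahead, wasting most of every fetched cache line.
    total = 0
    n_rows, n_cols = len(matrix), len(matrix[0])
    for c in range(n_cols):
        for r in range(n_rows):
            total += matrix[r][c]
    return total

if __name__ == "__main__":
    n = 500
    matrix = [[1] * n for _ in range(n)]
    t0 = time.perf_counter(); s1 = traverse_rows(matrix); t1 = time.perf_counter()
    s2 = traverse_cols(matrix); t2 = time.perf_counter()
    assert s1 == s2 == n * n  # both orders visit every element exactly once
    print(f"row-major: {t1 - t0:.4f}s, column-major: {t2 - t1:.4f}s")
```

In a compiled language the same experiment on a large matrix typically shows a several-fold slowdown for the strided version, which is the data-cache hit rate at work.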
#### Cache coherence

There are two solutions for cache coherence: write-through and write-back.

Write-through: on a write, the data is written to both the cache and memory.

Before writing, the CPU first checks whether the data is already in the cache. If it is, the cache is updated first and the data is then written to memory. If it is not, the data is written straight to memory.

Write-back: on a write, the new data is written only into the cache line; it is written to memory only when the modified cache line is replaced.

If the data is already in the cache when the write happens, it is updated in the cache and the cache line is marked Dirty. The dirty mark means that, at this point, the data in this cache line is inconsistent with memory.

If, when the write happens, the cache line that should hold the data contains data for a different memory address, the CPU checks whether that cache line is marked dirty.

- If it is dirty, the data in the cache line is written back to memory first; then the data to be written is read from memory into the cache, the write is applied to the cache line, and the line is marked dirty.
- If it is not dirty, the data is written directly into the cache line and the line is marked dirty.
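The write-back decision procedure maps naturally to code. Below is a minimal sketch with a single cache slot in front of a memory store; the class and method names are invented for illustration:

```python
# Minimal write-back sketch: one cache slot in front of a memory dict.
class WriteBackSlot:
    def __init__(self, memory):
        self.memory = memory   # backing store: {address: value}
        self.addr = None       # address currently cached
        self.value = None
        self.dirty = False

    def write(self, addr, value):
        if self.addr != addr:            # slot holds a different address
            if self.dirty:               # evict: flush the dirty line first
                self.memory[self.addr] = self.value
            self.addr = addr             # load the new line
        self.value = value               # apply the write in the cache only
        self.dirty = True                # cache is now newer than memory

mem = {0: 10, 1: 20}
slot = WriteBackSlot(mem)
slot.write(0, 11)
assert mem[0] == 10              # write-back: memory not updated yet
slot.write(1, 21)                # eviction flushes address 0 to memory
assert mem[0] == 11 and mem[1] == 20
```

The payoff is that repeated writes to the same line cost one memory write at eviction time instead of one per store, which is why write-back is the common choice despite the extra bookkeeping.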
#### Multi-core cache coherence

Each core of a multi-core CPU has its own L1 and L2, so the write-through and write-back schemes only solve the single-core problem.

Solving the multi-core problem requires two things:

- Write propagation: when the cache in one core is updated, the update must be propagated to the other cores. This addresses case 1 below.
- Transaction serialization: the order of a core's operations on data must look the same from every other core. This addresses case 2 below.
<div style="text-align: center; margin: 5px auto">
<img src="/image/computer-science/hardware/wreite_propagation.drawio.png">
</div>
<div style="text-align: center; margin: 5px auto">
<img src="/image/computer-science/hardware/transaction_serialization.drawio.png">
</div>
Bus snooping: when a core updates its cache, the event is broadcast to the other cores. Every core must listen to all activity on the bus at all times, and a broadcast is sent regardless of whether any other core's cache holds the same data.

Bus snooping alone does not guarantee transaction serialization. Achieving it requires two things:

- A CPU core's operations on cached data must be synchronized to the other CPU cores.
- A notion of locking: if two CPU cores cache the same data, the corresponding update may proceed only after the lock has been acquired.

##### MESI protocol

MESI implements transaction serialization on top of the bus-snooping mechanism, and uses a state machine to reduce the pressure on bus bandwidth.

MESI is an acronym of four words: Modified, Exclusive, Shared, Invalidated.

- The Modified state is the dirty mark mentioned earlier: the data in the cache line has been updated but not yet written to memory.
- The Invalidated state means the data in this cache line is no longer valid and must not be read.
- The Exclusive and Shared states both mean the data in the cache line is clean, i.e. consistent with memory.
  - In the Exclusive state the data is cached by only one CPU core and no other core caches it. There is no coherence problem, so the core can write freely without notifying the other cores.
  - If other cores then read the same data from memory into their own caches, data in the Exclusive state becomes Shared.
  - In the Shared state the same data is cached by multiple CPU cores, so it cannot be modified directly: the core must first broadcast a request asking all other CPU cores to mark the corresponding cache line Invalidated, and only then update the data in its own cache.
<div style="text-align: center; margin: 5px auto">
<img src="/image/computer-science/hardware/cache_mesi.drawio.png">
</div>
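The state transitions in the diagram can be replayed with a toy state machine. This is a simplified illustration, not a full MESI implementation (a real protocol also handles read misses, write-backs, and the actual bus transactions):

```python
# Toy MESI sketch: each core tracks the state of one shared cache line.
M, E, S, I = "Modified", "Exclusive", "Shared", "Invalidated"

class Core:
    def __init__(self, name):
        self.name, self.state = name, I

class Bus:
    def __init__(self, cores):
        self.cores = cores

    def read(self, core):
        others = [c for c in self.cores if c is not core and c.state != I]
        for c in others:
            if c.state in (M, E):   # another core held it privately
                c.state = S
        core.state = S if others else E

    def write(self, core):
        if core.state in (M, E):    # private copy: write silently
            core.state = M
            return
        for c in self.cores:        # shared/invalid: invalidate the others first
            if c is not core:
                c.state = I
        core.state = M

cores = [Core("c0"), Core("c1")]
bus = Bus(cores)
bus.read(cores[0])                  # sole reader -> Exclusive
assert cores[0].state == E
bus.read(cores[1])                  # second reader -> both Shared
assert cores[0].state == S and cores[1].state == S
bus.write(cores[1])                 # writer invalidates the other copy
assert cores[1].state == M and cores[0].state == I
```

Note how a write from the Exclusive state needs no bus traffic at all — that silent transition is exactly where MESI saves bandwidth over naive bus snooping.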
### Program

$\text{CPU execution time of a program} = \text{number of machine cycles} \times \text{clock cycle time}$

$\text{number of machine cycles} = \text{number of instructions} \times \text{CPI (average clock cycles per instruction)}$
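A quick worked example of the two formulas (the numbers are made up for illustration):

```python
# Hypothetical program: 2 million instructions with a CPI of 2,
# on a 1 GHz CPU (clock cycle = 1 / 1e9 seconds).
instructions = 2_000_000
cpi = 2                          # average clock cycles per instruction
clock_rate_hz = 1_000_000_000
clock_cycle_s = 1 / clock_rate_hz

machine_cycles = instructions * cpi            # 4,000,000 cycles
cpu_time_s = machine_cycles * clock_cycle_s
print(cpu_time_s)                              # 0.004 -> 4 milliseconds
```

The formula also shows the three levers for making a program faster: fewer instructions, a lower CPI, or a shorter clock cycle.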
### Instruction

A program is made up of a series of operations; these operations are called instructions.

Instructions tell the computer what to do, e.g., arithmetic addition, arithmetic subtraction, reading data from memory, writing data to memory.

An instruction consists of an operation code and the source of the data (a memory address, a register ID, and so on).

Both instructions and data can be stored in memory in binary form.

### Instruction cycle

The instruction cycle: the time the CPU needs to execute one instruction.

First assume a simple CPU and a simple RAM:

- an arithmetic logic unit
- a control unit
- two 8-bit registers (A and B) for temporarily storing data
- an instruction address register, which stores the memory address of the current instruction
- an instruction register, which stores the current instruction
- an instruction table (instruction set)

Assume instructions are 8 bits: the first four bits store the operation code and the last four bits store a memory address or a register. There are four operations in total:

- read data from memory into register A
- read data from memory into register B
- add the data in register B to register A
- store the data in register A to memory

The control unit is built from a pile of logic gates; its job is to determine which entry in the instruction table an instruction's operation code corresponds to.

Assume the RAM has 16 locations and each location can hold 8 bits of data.

#### Executing one instruction

Executing one instruction goes through three stages: instruction fetch, instruction decode, and execute.

At initialization, all registers are set to 0, and a program is loaded into RAM, assumed to start at address 0.

In the fetch stage, the job is to get the instruction. The CPU connects the instruction address register to the RAM; the register's value is 0, so the RAM returns the data at address 0. The returned data is copied into the instruction register.

In the decode stage, the job is to figure out what the instruction is supposed to do. The CPU uses the control unit to determine which operation code the instruction in the instruction register contains.

Assume the decode result is — operation code: read data from memory into register A; address: 9.

In the execute stage, the decode result is carried out: the operand at RAM address 9 is fetched and placed into register A.

After execution completes, all circuits are closed and the instruction address register is incremented by 1, ending this instruction's cycle.

The CPU distinguishes fetching an instruction from fetching an operand by which stage of the instruction cycle it is in.
#### Executing one operation

Assume the whole operation is adding two numbers.

Step 1: fetch a number from RAM into register A.

Step 2: fetch a number from RAM into register B.

Step 3: add the numbers in registers A and B and put the result into register A. The control unit hands the two operands and an add instruction to the ALU, which performs the arithmetic/logic operation. The result cannot be placed directly into register A, because while the circuit is still connected the data in register A keeps feeding into the ALU. So the control unit temporarily keeps the result in a register of its own, shuts off the ALU, and finally places the result into register A.

Step 4: store the data in register A to RAM.
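The four-step add can be replayed on a toy simulator of the 4-instruction CPU described in this section. The opcode bit patterns below are an assumption for illustration; the original post does not fix specific encodings:

```python
# Toy fetch-decode-execute loop for the 4-instruction, 16-location-RAM CPU.
LOAD_A, LOAD_B, ADD_B_TO_A, STORE_A = 0b0001, 0b0010, 0b1000, 0b0100  # assumed opcodes

def run(ram, max_steps=16):
    a = b = pc = 0
    for _ in range(max_steps):
        instr = ram[pc]                        # fetch
        op, addr = instr >> 4, instr & 0x0F    # decode: high 4 bits / low 4 bits
        if op == LOAD_A:                       # execute
            a = ram[addr]
        elif op == LOAD_B:
            b = ram[addr]
        elif op == ADD_B_TO_A:
            a = (a + b) & 0xFF                 # 8-bit registers wrap around
        elif op == STORE_A:
            ram[addr] = a
        else:
            break                              # unknown opcode: halt
        pc += 1                                # instruction address register + 1
    return ram

ram = [0] * 16
ram[0] = (LOAD_A << 4) | 14       # A <- RAM[14]
ram[1] = (LOAD_B << 4) | 15       # B <- RAM[15]
ram[2] = (ADD_B_TO_A << 4)        # A <- A + B
ram[3] = (STORE_A << 4) | 13      # RAM[13] <- A
ram[14], ram[15] = 3, 4
run(ram)
print(ram[13])  # → 7
```

Storing the program and its operands in the same 16-location RAM is the stored-program idea from the "Instruction" section: instructions and data are both just bytes.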
### Machine cycle

A machine cycle (also called a CPU cycle): the time the CPU needs to complete one basic operation.

The execution of an instruction is divided into several stages, and each stage completes one basic operation.

### Clock

The clock governs the CPU's working rhythm. It fires pulse signals at precise intervals, and the control unit uses this signal to advance the CPU's internal operations. The rhythm cannot be too fast, because electricity takes time to propagate.

- Clock rate, measured in Hz (hertz); 10 Hz means the clock fires 10 pulses per second.
- Clock cycle (also called the oscillation period): each high-to-low swing of the pulse signal is one cycle; it is the reciprocal of the clock rate.
- The higher the clock rate, the shorter the clock cycle and the faster the CPU works.
- Overclocking can raise the CPU's throughput, but it causes heat problems, or garbage output when the chip cannot keep up with the clock rate.
- Underclocking saves power; the CPU does not need to run at full speed all the time.

### Instruction pipeline

Instruction pipeline: split the work of one instruction into several small steps, each completed by a dedicated circuit, and then use parallel processing to raise instruction throughput.

- Advanced CPUs dynamically reorder instructions with dependencies and execute them out of order (relative to the original order).
- Advanced CPUs use branch prediction to put instructions into the pipeline ahead of time.

### Other ways to improve performance

- Superscalar processors: complete multiple instructions per clock cycle.
- Multi-core processors: one CPU has multiple cores and can process multiple instruction streams at the same time.
- Multiple independent CPUs: several independent CPUs working at the same time.
### References

- Crash Course Computer Science
  - [bilibili](https://www.bilibili.com/video/BV1EW411u7th)
  - [CrashCourse subtitle team](https://github.com/1c7/crash-course-computer-science-chinese)
  - [original videos on YouTube](https://www.youtube.com/playlist?list=PL8dPuuaLjXtNlUrzyH5r6jN9ulI)
- [小林coding](https://xiaolincoding.com/)
  - [图解系统 (Illustrated Systems)](https://xiaolincoding.com/os/)
3bcce47ce17d4dc6d6f530d5733676cf06d5b1da | 79 | md | Markdown | README.md | loneicewolf/Technical_Template_1 | 1fca6eb5c377084415f7de9693eadaff4856b1e8 | [
"MIT"
] | 1 | 2021-02-09T09:08:23.000Z | 2021-02-09T09:08:23.000Z | README.md | loneicewolf/Technical_Template_1 | 1fca6eb5c377084415f7de9693eadaff4856b1e8 | [
"MIT"
] | null | null | null | README.md | loneicewolf/Technical_Template_1 | 1fca6eb5c377084415f7de9693eadaff4856b1e8 | [
"MIT"
] | null | null | null | # Technical_Template_1
A template for myself to be used when creating Gits. \0
| 26.333333 | 55 | 0.797468 | eng_Latn | 0.997782 |
3bccec503321c9d7016cc9b234cddbe787e9bd81 | 1,196 | md | Markdown | rivernet_prep/README_rivernet-prep.md | BrisClimate/flood-cascade | 660c29275a87785153d0f107ed23104fcbcbddee | [
"MIT"
] | null | null | null | rivernet_prep/README_rivernet-prep.md | BrisClimate/flood-cascade | 660c29275a87785153d0f107ed23104fcbcbddee | [
"MIT"
] | null | null | null | rivernet_prep/README_rivernet-prep.md | BrisClimate/flood-cascade | 660c29275a87785153d0f107ed23104fcbcbddee | [
"MIT"
] | 3 | 2020-11-08T16:01:47.000Z | 2021-01-13T17:13:32.000Z | ### River network generation.
1. Use LFPTools (and TauDEM) to generate river network.
- 1a. `lfp-prepdata -i GBM_splitd8.cfg`
- 1b. `lfp-split -i GBM_splitd8.cfg`
- EXAMPLE configuration file to define Ganges-Brahmaputra-Meghna (GBM) stream network using MERIT-hydro as input: `GBM_splitd8.cfg`. NOTE basin 077 was manually determined to correspond to the GBM basin by checking values in 'basins3.tif'.
- See installation notes in LFPtools repository
2. (Optional): Downsample the river network to a lower horizontal resolution for LISFLOOD-FP simulations. (scripts originally from https://github.com/pfuhe1/downsample_hydro).
- 2a. `python downsample_hydro.py --nwindow 3 --count_thresh 2 --max_start_acc 510 --datadir <DATADIR>`
- 2b. `python call_streamnet_downsample.py --res '9s' --datadir <DATADIR>`
- DATADIR is the location of the river basin extracted by the 'lfp-split' script e.g. outdir in `GBM_splitd8.cfg`
3. `lisflood_discharge_inputs_qgis.py`:
Run in qgis python console to create shapefiles representing upstream accumulation for vertices in the river network. Used by `calc_q_returnperiod.py` and `lisflood_setup_bankfullQ.py` in `submit_scripts/lisflood-fp` folder.
| 66.444444 | 240 | 0.776756 | eng_Latn | 0.885241 |
3bd173dfee62513a3b0574ec7891c7b4a0a40347 | 4,013 | md | Markdown | docs/framework/unmanaged-api/profiling/icorprofilerinfo6-enumngenmodulemethodsinliningthismethod-method.md | mtorreao/docs.pt-br | e080cd3335f777fcb1349fb28bf527e379c81e17 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/unmanaged-api/profiling/icorprofilerinfo6-enumngenmodulemethodsinliningthismethod-method.md | mtorreao/docs.pt-br | e080cd3335f777fcb1349fb28bf527e379c81e17 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/unmanaged-api/profiling/icorprofilerinfo6-enumngenmodulemethodsinliningthismethod-method.md | mtorreao/docs.pt-br | e080cd3335f777fcb1349fb28bf527e379c81e17 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: ICorProfilerInfo6::EnumNgenModuleMethodsInliningThisMethod Method
ms.date: 03/30/2017
ms.assetid: b933dfe6-7833-40cb-aad8-40842dc3034f
ms.openlocfilehash: 8ed3f305deceacb976aeff994db1588f9e1ce1fb
ms.sourcegitcommit: da21fc5a8cce1e028575acf31974681a1bc5aeed
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 06/08/2020
ms.locfileid: "84495522"
---
# <a name="icorprofilerinfo6enumngenmodulemethodsinliningthismethod-method"></a>ICorProfilerInfo6::EnumNgenModuleMethodsInliningThisMethod Method

Returns an enumerator for all methods that are defined in a given NGen module and inline a given method.
## <a name="syntax"></a>Syntax
```cpp
HRESULT EnumNgenModuleMethodsInliningThisMethod(
[in] ModuleID inlinersModuleId,
[in] ModuleID inlineeModuleId,
[in] mdMethodDef inlineeMethodId,
[out] BOOL *incompleteData,
[out] ICorProfilerMethodEnum** ppEnum
);
```
## <a name="parameters"></a>Parameters

 `inlinersModuleId`\
 [in] The identifier of an NGen module.

 `inlineeModuleId`\
 [in] The identifier of a module that defines `inlineeMethodId`. For more information, see the Remarks section.

 `inlineeMethodId`\
 [in] The identifier of an inlined method. For more information, see the Remarks section.

 `incompleteData`\
 [out] A flag that indicates whether `ppEnum` contains all the methods that inline a given method. For more information, see the Remarks section.

 `ppEnum`\
 [out] A pointer to the address of an enumerator.
## <a name="remarks"></a>Remarks

 `inlineeModuleId` and `inlineeMethodId` together form the full identifier for the method that might be inlined. For example, assume module `A` defines a method `Simple.Add`:
```csharp
Simple.Add(int a, int b)
{ return a + b; }
```
and module B defines `Fancy.AddTwice`:
```csharp
Fancy.AddTwice(int a, int b)
{ return Simple.Add(a,b) + Simple.Add(a,b); }
```
Let's also assume that `Fancy.AddTwice` inlines the call to `Simple.Add`. A profiler could use this enumerator to find all the methods defined in module B that inline `Simple.Add`, and the result would enumerate `AddTwice`. `inlineeModuleId` is the identifier of module `A`, and `inlineeMethodId` is the identifier of `Simple.Add(int a, int b)`.

If `incompleteData` is true after the function returns, the enumerator does not contain all the methods inlining a given method. This can happen when one or more direct or indirect dependencies of the inliners module have not been loaded yet. If a profiler needs accurate data, it should retry later when more modules are loaded, preferably on each module load.

The `EnumNgenModuleMethodsInliningThisMethod` method can be used to work around limitations on inlining for ReJIT. ReJIT lets a profiler change the implementation of a method and then create new code for it on the fly. For example, we could change `Simple.Add` as follows:
```csharp
Simple.Add(int a, int b)
{ return 42; }
```
However, because `Fancy.AddTwice` has already inlined `Simple.Add`, it continues to behave the same as before. To work around that limitation, the caller has to search for all the methods in all modules that inline `Simple.Add` and use `ICorProfilerInfo5::RequestRejit` on each of those methods. When the methods are recompiled, they will have the new behavior of `Simple.Add` instead of the old behavior.
## <a name="requirements"></a>Requirements

**Platforms:** See [System Requirements](../../get-started/system-requirements.md).

**Header:** CorProf.idl, CorProf.h

**Library:** CorGuids.lib

**.NET Framework Versions:** [!INCLUDE[net_current_v46plus](../../../../includes/net-current-v46plus-md.md)]
## <a name="see-also"></a>See also

- [ICorProfilerInfo6 Interface](icorprofilerinfo6-interface.md)
# logged
[![Go Report Card](https://goreportcard.com/badge/github.com/msales/logged)](https://goreportcard.com/report/github.com/msales/logged)
[![Build Status](https://travis-ci.org/msales/logged.svg?branch=master)](https://travis-ci.org/msales/logged)
[![Coverage Status](https://coveralls.io/repos/github/msales/logged/badge.svg?branch=master)](https://coveralls.io/github/msales/logged?branch=master)
[![GoDoc](https://godoc.org/github.com/msales/logged?status.svg)](https://godoc.org/github.com/msales/logged)
[![GitHub release](https://img.shields.io/github/release/msales/logged.svg)](https://github.com/msales/logged/releases)
[![GitHub license](https://img.shields.io/badge/license-MIT-blue.svg)](https://raw.githubusercontent.com/msales/logged/master/LICENSE)
A fast logger for Go.
## Overview
Install with:
```shell
go get github.com/msales/logged
```
## Examples
```go
// Composable handlers
h := logged.LevelFilterHandler(
logged.Info,
logged.StreamHandler(os.Stdout, logged.LogfmtFormat()),
)
// The logger can have an initial context
l := logged.New(h, "env", "prod")
// All messages can have a context
l.Warn("connection error", "redis", conn.Name(), "timeout", conn.Timeout())
```
Will log the message
```
lvl=warn msg="connection error" redis=dsn_1 timeout=0.500
```
## License
MIT-License. As is. No warranties whatsoever. Mileage may vary. Batteries not included.
<properties
    pageTitle="Azure Government Documentation | Microsoft Azure"
    description="Provides a comparison of features and guidance on developing applications for Azure Government"
services="Azure-Government"
cloud="gov"
documentationCenter=""
authors="ryansoc"
manager="zakramer"
editor=""/>
<tags
ms.service="multiple"
ms.devlang="na"
ms.topic="article"
ms.tgt_pltfrm="na"
ms.workload="azure-government"
ms.date="10/12/2016"
ms.author="ryansoc"/>
# <a name="azure-government-security-and-identity"></a>Azure Government security and identity

## <a name="key-vault"></a>Key Vault

For details on this service and how to use it, see the <a href="https://azure.microsoft.com/documentation/services/key-vault">public Azure Key Vault documentation.</a>

### <a name="data-considerations"></a>Data considerations
The following information identifies the Azure Government boundary for Azure Key Vault:

| Regulated/controlled data permitted | Regulated/controlled data not permitted |
|-------------------------------------|-----------------------------------------|
| All data encrypted with an Azure Key Vault key may contain regulated/controlled data. | Azure Key Vault metadata is not permitted to contain export-controlled data. This metadata includes all configuration data entered when creating and maintaining your key vault. Do not enter regulated/controlled data into the following fields: resource group names, Key Vault names, tag names. |
Key Vault is generally available in Azure Government. As in the public cloud, there is no portal extension for Key Vault; it is available through PowerShell and CLI only.
## <a name="next-steps"></a>Next steps

For supplementary information and updates, subscribe to the <a href="https://blogs.msdn.microsoft.com/azuregov/">Microsoft Azure Government blog.</a>
# Project Planning
## Problem Statement
Primary User: Individuals with a high volume of expenses that are higher than expected, but no time to keep track of them all
User's Needs: An easy way to see what the user is spending on, and how much in total for a certain category, so that spending can show some restraint going forward.
Application: The application would give a quick, easy summary of how much spending a person does over a certain time period for a certain category. It will also tell the user how much more he should spend to stay in line with his goal.
Functionality: The application should retrieve data from the user's financial institutions and organize them in way to provide user with useful information. Then the program should give recommendations.
Improvements: Instead of needing to log into each account and sorting/adding up transactions to get what you want, the app can easily do all that and spit out a recommendation for you to either show some spending restraint, or be free to spend more.
## Information Requirements
### Information Inputs
Inputs: User inputs would include a date range, a financial institution, or a category of spending that you want to analyze (or all categories!)
Data inputs include transaction data from user's financial institutions such as credit card companies and banks.
This information will come from Buxfer.com, which is a website that you can link all your financial institution accounts to. You can also upload statements individually to analyze.
All information will be dollar amounts, as expenses are all going to be in USD.
There may be some initial setup required, such as logging into your bank account, or putting in a statement.
### Information Outputs
Outputs: Outputs would be in the format of US dollars and strings (such as the category of spending).
It will take data, most likely use a sum function, or a division function (to find % of budget left) and spit out a relevant number.
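As a rough illustration of the summary logic described above (the data below is made up, and Buxfer's real transaction format may differ):

```python
# Sketch of the planned output logic: sum spending per category and report
# the percentage of a budget remaining. Transaction data here is invented.
transactions = [
    {"category": "food", "amount": 25.40},
    {"category": "food", "amount": 12.10},
    {"category": "travel", "amount": 60.00},
]

def total_for(category, txns):
    """Sum all transaction amounts for one category."""
    return sum(t["amount"] for t in txns if t["category"] == category)

def budget_left_pct(category, budget, txns):
    """Percentage of the category budget still unspent."""
    return 100.0 * (budget - total_for(category, txns)) / budget

food_total = total_for("food", transactions)          # 37.5
pct = budget_left_pct("food", 100.0, transactions)    # 62.5
```

The same two helpers cover both outputs described above: a dollar total per category, and a division to find the percentage of the budget left.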
## Technology Requirements
### APIs and Web Service Requirements
I would use the Buxfer API (https://www.buxfer.com/help/api).
It would be used to grab financial data from the user's institutions, which can then be used to do the analysis.
I have not looked deeply into the documentation, nor have I tried using the API, but a quick glance suggests the
information is very straightforward.
### Python Package Requirements
Third party Python package:
The Requests package will be used to grab information from the website.
The Pytest package may be used to test the program.
pip install requests
pip install pytest
### Hardware Requirements
I will be running the program on my own local computer, as I do not know how to do it on a public server yet.
# Prj
* [Prj_Project_Participants](Prj_Project_Participants.md)
* [Prj_Project_Property_Values](Prj_Project_Property_Values.md)
* [Prj_Project_Risk_Discussion](Prj_Project_Risk_Discussion.md)
* [Prj_Project_Risks](Prj_Project_Risks.md)
* [Prj_Project_Task_Dependancies](Prj_Project_Task_Dependancies.md)
* [Prj_Project_Task_Materials](Prj_Project_Task_Materials.md)
* [Prj_Project_Task_Participants](Prj_Project_Task_Participants.md)
* [Prj_Project_Task_Property_Values](Prj_Project_Task_Property_Values.md)
* [Prj_Project_Task_Resources](Prj_Project_Task_Resources.md)
* [Prj_Project_Tasks](Prj_Project_Tasks.md)
* [Prj_Project_Work_Elements](Prj_Project_Work_Elements.md)
* [Prj_Projects](Prj_Projects.md)
* [Prj_Resources](Prj_Resources.md)
* [Prj_Task_Types](Prj_Task_Types.md)
* [Prj_Template_Risks](Prj_Template_Risks.md)
* [Prj_Template_Work_Elements](Prj_Template_Work_Elements.md)
* [Prj_Templates](Prj_Templates.md)
* [Prj_Type_Roles](Prj_Type_Roles.md)
* [Prj_Type_Work_Elements](Prj_Type_Work_Elements.md)
* [Prj_Type_Work_Types](Prj_Type_Work_Types.md)
* [Prj_Types](Prj_Types.md)
* [Prj_Work_Report_Materials](Prj_Work_Report_Materials.md)
* [Prj_Work_Report_Resources](Prj_Work_Report_Resources.md)
* [Prj_Work_Reports](Prj_Work_Reports.md)
---
title: Sharing Our Lives
date: 26/11/2020
---
`Read 1 Thessalonians 2:6–8. What does Paul say here that we could, and should, reflect in our schools and churches?`

Living in a society in which community is breaking down, we live in an age in which the biblical understanding of the church has never been more relevant. As Matthew 18:20 reminds us: "For where two or three are gathered together in My name, there am I in the midst of them." The New Testament vision of church and community took shape primarily in believers' homes. Here the community met in small groups, prayed, sang, celebrated the Lord's Supper, studied, and shared the words of Jesus with one another. These praying groups also became the first church schools, since this was where new members were introduced to the Bible and to the new life found in Jesus. Paul's writings, such as Romans 12:2 — "And be not conformed to this world: but be ye transformed by the renewing of your mind, that ye may prove what is that good, and acceptable, and perfect, will of God" — indicate that the church took this educational work very seriously.

These early believers soon discovered that the gospel is best lived out in community. In community we have a reason to sing louder, to pray more fervently, and to be more caring and compassionate. When we hear others speak of God's goodness, we too sense how good He has been to us. When we hear of others' struggles and pain, we sense God's healing in our own lives, and we experience a new longing to be instruments of His grace and healing.

In today's verses Paul asserts that the gospel of God is everything: the power of the cross, the resurrection of the Lord, the promise of His return. There simply is no better news in all the world, and Paul spent his life devoted to the challenge of sharing the story of Jesus with the utmost integrity and dedication.

Yet Paul suggests that the gospel message is best understood and experienced when we share our own lives. We must never forget that people watch our lives and look to see whether they illustrate the message of grace found in the Bible.

`Think more deeply about how you live, and ask yourself: what kind of witness am I to those who live around me?`
# relm examples
A few relm examples. To build and run one of them, just do:
``` Shell
cargo run --example EXAMPLE-NAME
```
Please be sure to have all the required libraries installed before building the examples (see http://gtk-rs.org/docs/requirements.html).
# README is outdated and will be updated soon! :)
Example at: http://michaelwolbert.nl/angular-css-promise/example/
# ngPromise attribute directive
Adds classes to elements that reflect the outcome of promise(s).
## Promise and Array of Promises ($q.all)
```html
<element ng-promise="oneOrArray"/>
```
**Directive steps:**
1) Add ng-promise class
```html
<element class="ng-promise" ng-promise="oneOrArray"/>
```
2) The promise is executed and pending
```html
<element class="ng-promise-pending" ng-promise="oneOrArray"/>
```
3) The promise is resolved or rejected and settles
```html
<element class="ng-promise-resolved" ng-promise="oneOrArray"/>
<element class="ng-promise-rejected" ng-promise="oneOrArray"/>
```
## Object of Promises
```javascript
$scope.object = {
first: Promise,
second: Promise,
third: Promise,
};
```
```html
<element ng-promise="object">
<element ng-promised="first"/>
<element ng-promised="second"/>
<element ng-promised="third"/>
</element>
```
**Directive steps:**
1) Add ng-promise class to element for each Promise in Object (note how nested ng-promised maps to the object keys)
```html
<element class="ng-promise-initial-first ng-promise-initial-second ng-promise-initial-third" ng-promise="object">
<element class="ng-promise-initial" ng-promised="first"/>
<element class="ng-promise-initial" ng-promised="second"/>
<element class="ng-promise-initial" ng-promised="third"/>
</element>
```
2) Object Promises are executed and pending
```html
<element class="ng-promise-pending-first ng-promise-pending-second ng-promise-pending-third" ng-promise="object">
<element class="ng-promise-pending" ng-promised="first"/>
<element class="ng-promise-pending" ng-promised="second"/>
<element class="ng-promise-pending" ng-promised="third"/>
</element>
```
3) Object Promises are resolved or rejected, thereafter settled
```html
<element class="ng-promise-resolved-first ng-promise-settled-first ng-promise-rejected-second ng-promise-settled-second ng-promise-resolved-third ng-promise-settled-third" ng-promise="object">
<element class="ng-promise-resolved ng-promise-settled" ng-promised="first"/>
<element class="ng-promise-rejected ng-promise-settled" ng-promised="second"/>
<element class="ng-promise-resolved ng-promise-settled" ng-promised="third"/>
</element>
```
## ngPromiseAnimateCssOptions attribute
Bind options to pass to $animateCss calls.
```html
<element class="ng-promise-resolved" ng-promise="onePromise" ng-promise-animate-css-options="{duration: 1}"/>
```
---
name: Colin Unger
year: 2
---
Colin Unger is a 2nd year CCS Physics and Computing double major from Davis, California whose main focus is on scientific computing. He currently works in the [Peters Lab](https://engineering.ucsb.edu/~baronp/) in Chemical Engineering on modelling nucleation. In addition to scientific computing, he has side interests in computer security (especially cryptography) and programming languages. In his free time, he can be found playing CTFs with [Shellphish](http://www.shellphish.net).
![](https://cdn.pixabay.com/photo/2016/02/19/11/19/office-1209640_960_720.jpg)
# From Problem to Approach
## 1. Business Understanding
- What is the problem that you are trying to solve?
- Business Understanding allows you to determine which analytic approach is needed to address the question
## 2. Analytic approach
- How can you use data to answer the question?
- Which __analytic approach__ to pick?
1. __Descriptive__
- Current status
2. __Diagnostic__ (Statistical Analysis)
- What happened?
- Why is this happening?
3. __Predictive__ (Forecasting)
- What if these trends continue?
- What will happen next?
4. __Prescriptive__
- How do we solve it?
- What are the __types__ of questions? : The correct approach depends on business requirements for the model
1. If the question is to __determine probabilities of an action__
- Use a __Predictive model__
2. If the question is to __show relationships__
- Use a __Descriptive model__
3. If the question __requires a yes & no answer__
- Use a __Classification model__
### Will machine learning be utilized?
- Learning without being explicitly programmed
- Identifies relationships and trends in data that might otherwise not be accessible or identified
- Uses clustering association approaches
# From Requirements to Collection
## 1. Data Requirements
- What are data requirements?
### Examples - Selecting the cohort
- __Define and select cohort__
- In-patient within health insurance provider's service area
- Primary diagnosis of __CHF__ in one year
- Continuous enrollment for at least 6 months prior to primary __CHF__ admission
- Disqualifying conditions
### Examples - Defining the data
- Content, formats, representations suitable for decision tree classifier
- One record per patient with column representing variables (dependent variable and predictors)
- Content covering all aspects of each patient's clinical history
- Transactional format
- Transformations required
## 2. Data Collection
- What occurs during data collection?
### Gathering available data
- Available data sources
- Corporate data warehouse (single source of medical & claims, eligibility, provider and member information)
- In-patient record system
- Claim payment system
- Disease management program information
### Deferring Inaccessible data
- Data wanted but not available
- Pharmaceutical records
- Decided to defer
### Example - Merging data
- Eliminate redundant data
Contributing to MushroomRL Benchmarking Suite
=============================================
We strongly encourage researchers to provide us with feedback and to contribute
to the MushroomRL Benchmarking Suite. You can contribute in the following ways:
* providing bug reports;
* implementing new state-of-the-art algorithms.
How to report bugs
------------------
Please use GitHub issues and apply the "bug" tag as a label. It is desirable if you can provide a minimal Python script
where the bug occurs. If the bug is confirmed, you can also provide a pull request to fix it, or wait for the maintainers to
resolve the issue.
Implementing new benchmarks
---------------------------
Customized benchmarks can be implemented by adding a configuration file and a builder for,
respectively, the environment and the algorithm at hand. Configuration files should be added in
``cfg/env/`` in the form of a .yaml file, where the hyper-parameters of the experiment, the
environment, and the algorithms used, are specified. The algorithm description has to be provided
in ``mushroom_rl_benchmark/builders``. A builder class should implement a constructor,
a ``build`` function where the algorithm is created, and a ``default`` function where default
settings for the algorithms are specified.
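A minimal sketch of the builder shape described above. The class name, hyper-parameters, and return values here are illustrative assumptions, not the real base class shipped in ``mushroom_rl_benchmark/builders``:

```python
# Hypothetical builder sketch: a constructor holding hyper-parameters, a
# build() that creates the agent for a given environment spec, and a
# default() classmethod returning the benchmark's default settings.
class SketchAgentBuilder:
    def __init__(self, learning_rate, n_steps):
        self.learning_rate = learning_rate
        self.n_steps = n_steps

    def build(self, mdp_info):
        # In a real builder this would instantiate a MushroomRL agent;
        # here we just return a dict standing in for the agent.
        return {"algorithm": "sketch", "lr": self.learning_rate, "mdp": mdp_info}

    @classmethod
    def default(cls):
        # Default hyper-parameter settings used when no config overrides them.
        return cls(learning_rate=3e-4, n_steps=2048)

builder = SketchAgentBuilder.default()
agent = builder.build("CartPole-v1")
```

The benchmark runner would then pair such a builder with the hyper-parameters read from the ``cfg/env/`` .yaml file.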
# L-SMASH-Works-Pipe-Feature-Auto-Builds
A repository for test builds of the feature_pipe branch of [Mr-Ojii's fork](https://github.com/Mr-Ojii/L-SMASH-Works) of L-SMASH Works.
# Calculations for the position of the sun and moon
[![Build Status](https://travis-ci.org/sffjunkie/astral-ts.svg?branch=develop)](https://travis-ci.org/sffjunkie/astral-ts)
Astral is a TypeScript package for calculating the times of various aspects of
the sun and phases of the moon.
It can calculate the following:
<dl>
<dt>Dawn</dt>
<dd>The time in the morning when the sun is a specific number of degrees
below the horizon.</dd>
<dt>Sunrise</dt>
    <dd>The time in the morning when the top of the sun breaks the horizon
        (assuming a location with no obscuring features.)</dd>
<dt>Noon</dt>
<dd>The time when the sun is at its highest point directly above the
observer.</dd>
<dt>Midnight</dt>
<dd>The time when the sun is at its lowest point.</dd>
<dt>Sunset</dt>
<dd>The time in the evening when the sun is about to disappear below the
        horizon (assuming a location with no obscuring features.)</dd>
<dt>Dusk</dt>
<dd>The time in the evening when the sun is a specific number of degrees
below the horizon.</dd>
<dt>Daylight</dt>
<dd>The time when the sun is up i.e. between sunrise and sunset</dd>
<dt>Night</dt>
<dd>The time between astronomical dusk of one day and astronomical dawn of
the next</dd>
<dt>Twilight</dt>
<dd>The time between dawn and sunrise or between sunset and dusk</dd>
<dt>The Golden Hour</dt>
<dd>The time when the sun is between 4 degrees below the horizon
and 6 degrees above.</dd>
<dt>The Blue Hour</dt>
<dd>The time when the sun is between 6 and 4 degrees below the
horizon.</dd>
<dt>Time At Elevation</dt>
<dd>The time when the sun is at a specific elevation for either a
rising or a setting sun.</dd>
<dt>Solar Azimuth</dt>
<dd>The number of degrees clockwise from North at which the sun can be
seen</dd>
<dt>Solar Zenith</dt>
<dd>The angle of the sun down from directly above the observer</dd>
<dt>Solar Elevation</dt>
<dd>The number of degrees up from the horizon at which the sun can
be seen</dd>
<dt>Rahukaalam</dt>
    <dd>Rahukaalam or the period of Rahu is a certain amount of time
        every day that is considered inauspicious for any new venture according to
        Indian Vedic astrology.</dd>
<dt>Moon Phase</dt>
<dd>The phase of the moon for a specified date.</dd>
</dl>
Astral also comes with a geocoder containing a local database that allows you to
look up information for a small set of locations;
[new locations can be added](#additional_locations).
## Examples
The following examples demonstrate some of the functionality available in the
module.
### Sun
```typescript
> import { DateTime } from "luxon";
> import { LocationInfo } from "astral";
> let city = new LocationInfo("London", "England", "Europe/London", 51.5, -0.116);
> console.log(
... `Information for ${city.name}/${city.region}
... Timezone: ${city.timezone}
... Latitude: ${city.latitude}; Longitude: ${city.longitude}
... `);
Information for London/England
Timezone: Europe/London
Latitude: 51.50; Longitude: -0.116
> import { DateTime } from "luxon";
> import { sun } from "astral/sun";
> let s = sun(city.observer, DateTime.fromObject({year: 2009, month: 4, day: 22}));
> console.log(
... `Dawn: ${s["dawn"].toISO()}
... Sunrise: ${s["sunrise"].toISO()}
... Noon: ${s["noon"].toISO()}
... Sunset: ${s["sunset"].toISO()}
... Dusk: ${s["dusk"].toISO()}`
... );
Dawn: 2009-04-22T04:13:04.923Z
Sunrise: 2009-04-22T04:50:16.515Z
Noon: 2009-04-22T11:59:02.000Z
Sunset: 2009-04-22T19:08:41.215Z
Dusk: 2009-04-22T19:46:06.362Z
```
### Moon
```typescript
> import { DateTime } from "luxon";
> import { phase } from "astral/moon";
> console.log(phase(DateTime.fromObject({year: 2018, month: 1, day: 1})));
13.255666666666668
```
The moon phase method returns a number describing the phase, where the value is
between 0 and 27.99. The following lists the mapping of various values to the
description of the phase of the moon.
| Value | Phase |
| ----------- | ------------- |
| 0 .. 6.99 | New moon |
| 7 .. 13.99 | First quarter |
| 14 .. 20.99 | Full moon |
| 21 .. 27.99 | Last quarter |
If for example the number returned was 27.99 then the moon would be almost at
the New Moon phase, and if it was 24.00 it would be half way between the Last
Quarter and a New Moon.
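The mapping in the table above can be captured in a small helper. `phaseName` is an illustration only, not part of the astral API:

```typescript
// Maps the 0..27.99 phase value returned by astral's phase() to the
// descriptions in the table above. Illustration only, not part of astral.
function phaseName(phase: number): string {
    if (phase < 7) return "New moon";
    if (phase < 14) return "First quarter";
    if (phase < 21) return "Full moon";
    return "Last quarter";
}

console.log(phaseName(13.255666666666668)); // → "First quarter"
```

Feeding it the value from the Moon example above reports that 2018-01-01 fell in the first quarter.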
Note: The moon phase does not depend on your location. However what the moon
actually looks like to you does depend on your location. If you're in the
southern hemisphere it looks different than if you were in the northern
hemisphere.
See http://moongazer.x10.mx/website/astronomy/moon-phases/ for further information.
### Geocoder
```typescript
> import { database, lookup } from "astral/geocoder";
> console.log(lookup("London", database()));
LocationInfo {
name: 'London',
region: 'England',
timezone: 'Europe/London',
latitude: 51.473333333333336,
longitude: -0.0008333333333333334
}
```
#### Custom Location
If you only need a single location that is not in the database then you can
construct a `LocationInfo` and fill in the values, either on
initialization
```typescript
import { LocationInfo } from "astral/index";
let l = new LocationInfo('name', 'region', 'timezone/name', 0.1, 1.2);
```
or set the attributes after initialization::
```typescript
import { LocationInfo } from "astral/index";
let l = new LocationInfo();
l.name = 'name';
l.region = 'region';
l.timezone = 'US/Central';
l.latitude = 0.1;
l.longitude = 1.2;
```
Note: `name` and `region` can be anything you like.
#### Additional Locations
You can add to the list of available locations using the
`add_locations` function and passing either a string with
one line per location or by passing an Array containing strings, Arrays or tuples
(lists and tuples are passed directly to the LocationInfo constructor).
```typescript
> import { addLocations, database, lookup } from "astral/geocoder";
> let db = database();
> try {
... lookup("Somewhere", db);
... }
... catch(err) {
... console.log(err.msg);
... }
...
Location or group "Somewhere" not found in database
> addLocations("Somewhere,Secret Location,UTC,24°28'N,39°36'E", db);
> console.log(lookup("Somewhere", db));
LocationInfo {
name: 'Somewhere',
region: 'Secret Location',
timezone: 'UTC',
latitude: 24.466666666666665,
longitude: 39.6
}
```
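For illustration, the degree/minute coordinate strings used above can be converted to decimal degrees like this. `dmsToDecimal` is a hypothetical helper (not astral's parser) that handles only the degrees-and-minutes form:

```typescript
// Hypothetical helper (not part of astral): convert a "24°28'N"-style
// degrees-and-minutes string into signed decimal degrees.
function dmsToDecimal(dms: string): number {
  const m = dms.match(/^(\d+)°(\d+)'([NSEW])$/);
  if (!m) throw new Error(`unparseable coordinate: ${dms}`);
  const value = Number(m[1]) + Number(m[2]) / 60;
  // South and West are negative by convention.
  return m[3] === "S" || m[3] === "W" ? -value : value;
}

console.log(dmsToDecimal("24°28'N")); // ≈ 24.4667
console.log(dmsToDecimal("39°36'E")); // 39.6
```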
#### Timezone Groups
Timezone groups such as Europe can be accessed via the `group` function in
the `geocoder` module
```typescript
> import { group } from "astral/geocoder";
> let europe = group("europe");
> console.log(Object.keys(europe).sort());
['aberdeen', 'amsterdam', 'andorra_la_vella', 'ankara', 'athens', ...]
```
## Effect of Elevation
### Times Of The Sun
The times of the sun that you experience depend on what obscures your view of
it. It may be obscured either by the horizon or by some other geographical
feature (e.g. mountains).
1. If what obscures you at ground level is the horizon and you are at an
elevation above ground level, then the times of the sun depend on how much
further round the earth you can see due to your elevation (the sun rises
earlier and sets later).
The extra angle you can see round the earth is determined by calculating the
angle α in the image below based on your elevation above ground level, and
adding this to the depression angle used for the sun calculations.
<img src="media://elevation_horizon.svg"/>
2. If your view is obscured by some geographical feature other than the
horizon, then the adjustment angle is based on how far you are above or below
the top of the feature and your horizontal distance to it.
For the first case (i.e. obscured by the horizon) you need to pass a single
number to the Observer as its elevation. For the second case pass a tuple of 2
numbers: the first is the vertical distance to the top of the feature and the
second is the horizontal distance to the feature.
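The two adjustment angles can be sketched as follows. This is a simplified illustration of the geometry (assuming a spherical earth and ignoring refraction), not astral's exact code:

```typescript
// Sketch of the two elevation adjustments described above.
// Assumptions: spherical earth, no atmospheric refraction.
const EARTH_RADIUS_M = 6_371_000;

// Case 1: horizon dip. The extra angle α you can see round the earth
// when standing `elevationM` metres above ground level.
function horizonDipDegrees(elevationM: number): number {
  const alpha = Math.acos(EARTH_RADIUS_M / (EARTH_RADIUS_M + elevationM));
  return alpha * (180 / Math.PI);
}

// Case 2: obscuring feature. Adjustment from the vertical distance to
// the top of the feature and the horizontal distance to it.
function featureAdjustmentDegrees(verticalM: number, horizontalM: number): number {
  return Math.atan2(verticalM, horizontalM) * (180 / Math.PI);
}

console.log(horizonDipDegrees(100));              // ≈ 0.32 degrees
console.log(featureAdjustmentDegrees(100, 1000)); // ≈ 5.71 degrees
```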
### Elevation Of The Sun
Even though an observer's elevation can significantly affect the times of the
sun, the same is not true for the elevation angle from the observer to the sun.
As an example the diagram below shows the difference in angle between an
observer at ground level and one on the ISS orbiting 408 km above the earth.
<img src="media://elevation_sun.svg"/>
The largest difference between the two angles is when the angle at ground level
is 1 degree. The difference then is approximately 0.15 degrees.
At the summit of mount Everest (8,848 m) the maximum difference is 0.00338821
degrees.
Due to the very small difference the astral package does not currently adjust
the solar elevation for changes in observer elevation.
## Effect of Refraction
When viewing the sun the position you see it at is different from its actual
position due to the effect of
[atmospheric refraction](https://en.wikipedia.org/wiki/Atmospheric_refraction)
which makes the sun appear to be higher in the sky. The calculations in the
package take this refraction into account.
The `sunrise` and `sunset` functions apply the refraction at the angle when
the sun is half of its apparent diameter below the horizon. This is between
about 30 and 32 arcminutes, and the astral package uses a value of
32 arcminutes.
Note: The refraction calculation does not take into account temperature and
pressure which can affect the angle of refraction.
## License
This module is licensed under the terms of the
[Apache](https://www.apache.org/licenses/LICENSE-2.0) V2.0 license.
## Dependencies
Astral has one required external dependency on
[luxon](https://moment.github.io/luxon/index.html).
## Installation
To install Astral you should use the `npm` tool:
```
npm install @sffjunkie/astral
```
## Cities
The module includes location and time zone data for the following cities. The
list includes all capital cities plus some from the UK. The list also includes
the US state capitals and some other US cities.
Aberdeen, Abu Dhabi, Abu Dhabi, Abuja, Accra, Addis Ababa, Adelaide, Al Jubail,
Albany, Albuquerque, Algiers, Amman, Amsterdam, Anchorage, Andorra la Vella,
Ankara, Annapolis, Antananarivo, Apia, Ashgabat, Asmara, Astana, Asuncion,
Athens, Atlanta, Augusta, Austin, Avarua, Baghdad, Baku, Baltimore, Bamako,
Bandar Seri Begawan, Bangkok, Bangui, Banjul, Barrow-In-Furness, Basse-Terre,
Basseterre, Baton Rouge, Beijing, Beirut, Belfast, Belgrade, Belmopan, Berlin,
Bern, Billings, Birmingham, Birmingham, Bishkek, Bismarck, Bissau, Bloemfontein,
Bogota, Boise, Bolton, Boston, Bradford, Brasilia, Bratislava, Brazzaville,
Bridgeport, Bridgetown, Brisbane, Bristol, Brussels, Bucharest, Bucuresti,
Budapest, Buenos Aires, Buffalo, Bujumbura, Burlington, Cairo, Canberra, Cape
Town, Caracas, Cardiff, Carson City, Castries, Cayenne, Charleston, Charlotte,
Charlotte Amalie, Cheyenne, Chicago, Chisinau, Cleveland, Columbia, Columbus,
Conakry, Concord, Copenhagen, Cotonou, Crawley, Dakar, Dallas, Damascus, Dammam,
Denver, Des Moines, Detroit, Dhaka, Dili, Djibouti, Dodoma, Doha, Douglas,
Dover, Dublin, Dushanbe, Edinburgh, El Aaiun, Fargo, Fort-de-France, Frankfort,
Freetown, Funafuti, Gaborone, George Town, Georgetown, Gibraltar, Glasgow,
Greenwich, Guatemala, Hanoi, Harare, Harrisburg, Hartford, Havana, Helena,
Helsinki, Hobart, Hong Kong, Honiara, Honolulu, Houston, Indianapolis,
Islamabad, Jackson, Jacksonville, Jakarta, Jefferson City, Jerusalem, Juba,
Jubail, Juneau, Kabul, Kampala, Kansas City, Kathmandu, Khartoum, Kiev, Kigali,
Kingston, Kingston, Kingstown, Kinshasa, Koror, Kuala Lumpur, Kuwait, La Paz,
Lansing, Las Vegas, Leeds, Leicester, Libreville, Lilongwe, Lima, Lincoln,
Lisbon, Little Rock, Liverpool, Ljubljana, Lome, London, Los Angeles,
Louisville, Luanda, Lusaka, Luxembourg, Macau, Madinah, Madison, Madrid, Majuro,
Makkah, Malabo, Male, Mamoudzou, Managua, Manama, Manchester, Manchester,
Manila, Maputo, Maseru, Masqat, Mbabane, Mecca, Medina, Melbourne, Memphis,
Mexico, Miami, Milwaukee, Minneapolis, Minsk, Mogadishu, Monaco, Monrovia,
Montevideo, Montgomery, Montpelier, Moroni, Moscow, Moskva, Mumbai, Muscat,
N'Djamena, Nairobi, Nashville, Nassau, Naypyidaw, New Delhi, New Orleans, New
York, Newark, Newcastle, Newcastle Upon Tyne, Ngerulmud, Niamey, Nicosia,
Norwich, Nouakchott, Noumea, Nuku'alofa, Nuuk, Oklahoma City, Olympia, Omaha,
Oranjestad, Orlando, Oslo, Ottawa, Ouagadougou, Oxford, P'yongyang, Pago Pago,
Palikir, Panama, Papeete, Paramaribo, Paris, Perth, Philadelphia, Phnom Penh,
Phoenix, Pierre, Plymouth, Podgorica, Port Louis, Port Moresby, Port of Spain,
Port-Vila, Port-au-Prince, Portland, Portland, Porto-Novo, Portsmouth, Prague,
Praia, Pretoria, Pristina, Providence, Quito, Rabat, Raleigh, Reading,
Reykjavik, Richmond, Riga, Riyadh, Road Town, Rome, Roseau, Sacramento, Saint
Helier, Saint Paul, Saint Pierre, Saipan, Salem, Salt Lake City, San Diego, San
Francisco, San Jose, San Juan, San Marino, San Salvador, Sana, Sana'a, Santa Fe,
Santiago, Santo Domingo, Sao Tome, Sarajevo, Seattle, Seoul, Sheffield,
Singapore, Sioux Falls, Skopje, Sofia, Southampton, Springfield, Sri
Jayawardenapura Kotte, St. George's, St. John's, St. Peter Port, Stanley,
Stockholm, Sucre, Suva, Swansea, Swindon, Sydney, T'bilisi, Taipei, Tallahassee,
Tallinn, Tarawa, Tashkent, Tbilisi, Tegucigalpa, Tehran, Thimphu, Tirana,
Tirane, Tokyo, Toledo, Topeka, Torshavn, Trenton, Tripoli, Tunis, Ulaanbaatar,
Ulan Bator, Vaduz, Valletta, Vienna, Vientiane, Vilnius, Virginia Beach, W.
Indies, Warsaw, Washington DC, Wellington, Wichita, Willemstad, Wilmington,
Windhoek, Wolverhampton, Yamoussoukro, Yangon, Yaounde, Yaren, Yerevan, Zagreb
### US Cities
Albany, Albuquerque, Anchorage, Annapolis, Atlanta, Augusta, Austin, Baltimore,
Baton Rouge, Billings, Birmingham, Bismarck, Boise, Boston, Bridgeport, Buffalo,
Burlington, Carson City, Charleston, Charlotte, Cheyenne, Chicago, Cleveland,
Columbia, Columbus, Concord, Dallas, Denver, Des Moines, Detroit, Dover, Fargo,
Frankfort, Harrisburg, Hartford, Helena, Honolulu, Houston, Indianapolis,
Jackson, Jacksonville, Jefferson City, Juneau, Kansas City, Lansing, Las Vegas,
Lincoln, Little Rock, Los Angeles, Louisville, Madison, Manchester, Memphis,
Miami, Milwaukee, Minneapolis, Montgomery, Montpelier, Nashville, New Orleans,
New York, Newark, Oklahoma City, Olympia, Omaha, Orlando, Philadelphia, Phoenix,
Pierre, Portland, Portland, Providence, Raleigh, Richmond, Sacramento, Saint
Paul, Salem, Salt Lake City, San Diego, San Francisco, Santa Fe, Seattle, Sioux
Falls, Springfield, Tallahassee, Toledo, Topeka, Trenton, Virginia Beach,
Wichita, Wilmington
## Thanks
The sun calculations in this package were adapted from the
spreadsheets on the following page.
https://www.esrl.noaa.gov/gmd/grad/solcalc/calcdetails.html
Refraction calculation is taken from
Sun-Pointing Programs and Their Accuracy\
John C. Zimmerman of Sandia National Laboratories\
https://www.osti.gov/servlets/purl/6377969
which cites the following as the original source:
In Solar Energy Vol. 20 No. 5-C\
Robert Walraven of the University of California, Davis
The moon phase calculation is based on some javascript code from Sky and
Telescope magazine
Moon-phase calculation\
Roger W. Sinnott, Sky & Telescope, June 16, 2006.\
http://www.skyandtelescope.com/wp-content/observing-tools/moonphase/moon.html
<!-- Also to `Sphinx`\_ for making doc generation an easy thing (not that the writing
of the docs is any easier.) -->
## Contact
Simon Kennedy <sffjunkie+code@gmail.com>
| 37.213075 | 122 | 0.747479 | eng_Latn | 0.957631 |
3bdae29cc0a2fd4c7b83528a24e62319d55ce29f | 3,862 | md | Markdown | _posts/2018-09-07-Download-invitation-to-the-lifespan-1st-edition.md | Anja-Allende/Anja-Allende | 4acf09e3f38033a4abc7f31f37c778359d8e1493 | [
"MIT"
] | 2 | 2019-02-28T03:47:33.000Z | 2020-04-06T07:49:53.000Z | _posts/2018-09-07-Download-invitation-to-the-lifespan-1st-edition.md | Anja-Allende/Anja-Allende | 4acf09e3f38033a4abc7f31f37c778359d8e1493 | [
"MIT"
] | null | null | null | _posts/2018-09-07-Download-invitation-to-the-lifespan-1st-edition.md | Anja-Allende/Anja-Allende | 4acf09e3f38033a4abc7f31f37c778359d8e1493 | [
"MIT"
] | null | null | null | ---
layout: post
comments: true
categories: Other
---
## Download Invitation to the lifespan 1st edition book
" Across hard-packed earth and fields of sandstone, bleak in invitation to the lifespan 1st edition of its aggressive cheeriness. And if you acknowledged that you'd come from evil, they "Smart thinking," said Venerate. But the day after that which had been fixed for our departure. That night he had been in utter despair. Per Zedd, the floors and walls shuddered? had been followed the whole time, and then with a groan put it upright once more, Junior-snap, they would have been brought together in an intolerably intimate tete-a-tete, in the first light. "Piggies are sweet, the year that Naomi had been killed. Alternate technology. And if, however, indoor plumbing, told the Master that it was time his daughter had her naming day. " "What do you know about it?" absurd. " The girl was creepy, paper covers rock, Tom had recognized the special bond between the blind boy and Lemon vodka diminishes mathematical ability, she added one of Joey's cardigan sweaters. " Instead of immediately killing anyone, well, he invitation to the lifespan 1st edition the opportunity and runs from Here was the final knave of spades, I take grasp. He's color-blind. redemption, clicking a fingernail against the aluminum as if to assess by sound onward into the labyrinth, accumulated through more than nine Convinced he was alone and unobserved. wasn't a bad kid, the mind had a thermostat of its own! "No vanilla wafers! Pewter-pounded. "I guess that it was difficult to comprehend how it had been possible to invitation to the lifespan 1st edition sort of holding off! I cannot rationalize electronically what happens. She pointed out the window at a passing group who were sporting a rainbow of fanciful hair colors and wearing leotards and tights beneath coats thrown casually around their shoulders. Pewter-pounded. So the trooper said to him, perhaps with a complimentary heroin lollipop, however. angel, corresponds so closely with the numbers to be the object of capture. 
" By the time that they were hooked up to utilities at a campsite associated with a motel-casino in something even worse and more embarrassing will occur. household word. That discord sets up lots of other vibrations, you can. The inner door opened and Lang pushed forwardвand right back into the airlock. A general store advertising dry goods, cabinets, but some massage would be involved. 0 5. We seem to have fooled these plants; they thought summer was here when the water vapor content went up around the camp. He says, that nearly two hundred years lowered, hitching around That's invitation to the lifespan 1st edition he really told me, so that our botanists could form an idea of the On your screen you will be given a display of your current sector of the galaxy and the stars in that file:D|Documents20and20Settingsharry, he briefly closed his right hand around the collecting edible roots, waitin' to be whatever-had been properly admired. Women had always been leaders in the league, and she must have succeeded, with the salt Tom and the pepper Tom standing side by side in "No ideas, the workers continue to snatched the car keys off the foyer table, and it did them no good, two large species of Carabus, which have been deserted for music. about something or other, ii, slightly brown complexion, for this alone would sustain her even in the hour of her He needed to keep moving. " went up at Celestina's acceptance of his proposal caused her to start, instead of a pencil, ruled their employees by terror-though they never screamed at movie engagement ring, blue -- they could not invitation to the lifespan 1st edition been have that within a single decade a number of vessels should sail that cavern was not on Roke. The way you organize it makes its own laws. | 429.111111 | 3,750 | 0.798291 | eng_Latn | 0.999936 |
3bdbd42a88c212284da7fbf0714a6d938aab3327 | 3,166 | md | Markdown | README.md | zoei/koa-router-config | 374f86c10dc22fdda48350979c6b94d2be8cc2c7 | [
"MIT"
] | 1 | 2018-03-30T14:16:45.000Z | 2018-03-30T14:16:45.000Z | README.md | zoei/koa-router-config | 374f86c10dc22fdda48350979c6b94d2be8cc2c7 | [
"MIT"
] | null | null | null | README.md | zoei/koa-router-config | 374f86c10dc22fdda48350979c6b94d2be8cc2c7 | [
"MIT"
] | null | null | null | ### koa-router-config
> Use [koa-router](github.com/alexmingoia/koa-router) with declarative configuration.
* Configurable controller root path.
* Multiple HTTP methods per route.
* Response redirects.
* Multiple route middleware.
* Nestable routers.
* ES7 async/await support.
## Installation
Install using [npm](https://www.npmjs.org/):
```sh
npm install koa-router-config
```
## API Reference
* koa-router-config
* ConfigRouter
* new ConfigRouter([opts])
* _instance_
* .router ⇒ <code>router</code>
* .config(configOpts, routesConfig) ⇒ <code>router</code>
### ConfigRouter ⏏
**Kind**: Exported class
### router
**Kind**: Exported [koa-router](github.com/alexmingoia/koa-router) instance
### config(configOpts, routesConfig)
* configOpts ⇒ <code>Object</code>
* controllerRoot: String, path of controller root.
* routesConfig ⇒ <code>Array or Object</code>
* Array: {RouteConfig, ...}
* Object: {url1: RouteConfig, url2: RouteConfig, ...}
* RouteConfig ⇒ <code>any</code>
* String: Controller name or redirect url.
* Array: [middleware1, middleware2, ..., middlewaren, controller]
* Object: { url, method, middlewares, controller, redirect }
### Examples
directories
```
├── controllers/
│ ├── home.js
│ ├── api/
│ ├── user.js
├── routes/
│ ├── api.js
│ ├── index.js
├── server.js
```
controllers/home.js
```javascript
module.exports = {
get: ctx => {
ctx.render('home')
}
}
```
controllers/api/user.js
```javascript
module.exports = {
get: ctx => {
ctx.body = { id: ctx.params.id }
}
}
```
routes/index.js
```javascript
const ConfigRouter = require('koa-router-config').ConfigRouter;
const configRouter = new ConfigRouter();
configRouter.config(
{ controllerRoot: path.resolve(__dirname, '../controllers') },
{
'/': 'home.get', // default method: 'GET'
'get /github': 'http://www.github.com', // response redirect
'get|post /logHeader': ctx => ctx.body = ctx.headers, // multi method
'/api': require('./api') // nestable router
}
);
module.exports = configRouter.router; // export koa-router instance
```
routes/api.js
```javascript
const ConfigRouter = require('koa-router-config').ConfigRouter;
const configRouter = new ConfigRouter();
configRouter.config(
{ controllerRoot: path.resolve(__dirname, '../controllers/api') },
[
{
url: '/',
controller: ctx => ctx.body = 'Hello API!',
},
{
url: '/users/:id', // default method: 'GET'
middlewares: [ctx => console.log('get user:' + ctx.params.id)],
controller: 'user.get'
},
{
url: 'post /auth', // method: 'POST'
redirect: 'https://github.com/login'
}
]
);
module.exports = configRouter.router; // export koa-router instance
```
server.js
```javascript
const Koa = require('koa')
const app = new Koa()
const router = require('./routes')
...
app.use(router.routes())
app.use(router.allowedMethods())
...
``` | 24.734375 | 89 | 0.599495 | eng_Latn | 0.262392 |
3bdeb06b2ca016ef27075e956fca67d2c820315d | 898 | md | Markdown | README.md | SiLab-Bonn/silab_online_monitor | 6a5114d2151867e0ac8b073e90e91125d2ca60de | [
"MIT"
] | null | null | null | README.md | SiLab-Bonn/silab_online_monitor | 6a5114d2151867e0ac8b073e90e91125d2ca60de | [
"MIT"
] | 8 | 2016-07-06T14:25:18.000Z | 2020-03-31T08:59:51.000Z | README.md | SiLab-Bonn/silab_online_monitor | 6a5114d2151867e0ac8b073e90e91125d2ca60de | [
"MIT"
] | 2 | 2016-04-28T14:23:04.000Z | 2016-12-05T20:33:49.000Z | # SiLab online monitor [![Build Status](https://travis-ci.org/SiLab-Bonn/silab_online_monitor.svg?branch=master)](https://travis-ci.org/SiLab-Bonn/silab_online_monitor) [![Coverage Status](https://coveralls.io/repos/SiLab-Bonn/silab_online_monitor/badge.svg?branch=master&service=github)](https://coveralls.io/github/SiLab-Bonn/silab_online_monitor?branch=master)
This python package contains specific implementations of converters and receivers for several SiLab data acquisition systems. These implementations are used by the generic online_monitor python package to generate fast real time data analysis and plotting.
So far the following systems are supported:
- pyBAR with ALTLAS FE-I4 data
# Installation
Download the latest code from github to a folder of your choise. Install by typing
```
python setup.py develop
```
Note: Only the develop installation is supported!
# Usage
TBD
| 39.043478 | 363 | 0.802895 | eng_Latn | 0.905655 |
3be07c6bd800bc02a5ff50e63ceed387c38628a3 | 1,869 | md | Markdown | articles/databox/data-box-limits.md | klmnden/azure-docs.tr-tr | 8e1ac7aa3bb717cd24e1bc2612e745aa9d7aa6b6 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2019-08-10T02:23:39.000Z | 2019-08-10T02:23:40.000Z | articles/databox/data-box-limits.md | klmnden/azure-docs.tr-tr | 8e1ac7aa3bb717cd24e1bc2612e745aa9d7aa6b6 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/databox/data-box-limits.md | klmnden/azure-docs.tr-tr | 8e1ac7aa3bb717cd24e1bc2612e745aa9d7aa6b6 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Azure Data Box limits | Microsoft Docs
description: Describes system limits and recommended sizes for the Microsoft Azure Data Box components and connections.
services: databox
author: alkohli
ms.service: databox
ms.subservice: pod
ms.topic: article
ms.date: 05/21/2019
ms.author: alkohli
ms.openlocfilehash: 2e1ed8df490343e569f9466fd56458f652dafaf6
ms.sourcegitcommit: d4dfbc34a1f03488e1b7bc5e711a11b72c717ada
ms.translationtype: MT
ms.contentlocale: tr-TR
ms.lasthandoff: 06/13/2019
ms.locfileid: "66244583"
---
# <a name="azure-data-box-limits"></a>Azure Data Box limits
Consider these limits as you deploy and operate your Microsoft Azure Data Box. The following tables describe the limits for the Data Box.
## <a name="data-box-service-limits"></a>Data Box service limits
[!INCLUDE [data-box-service-limits](../../includes/data-box-service-limits.md)]
## <a name="data-box-limits"></a>Data Box limits
- Data Box can store a maximum of 500 million files.
## <a name="azure-storage-limits"></a>Azure storage limits
[!INCLUDE [data-box-storage-limits](../../includes/data-box-storage-limits.md)]
## <a name="data-upload-caveats"></a>Data upload caveats
[!INCLUDE [data-box-data-upload-caveats](../../includes/data-box-data-upload-caveats.md)]
## <a name="azure-storage-account-size-limits"></a>Azure storage account size limits
[!INCLUDE [data-box-storage-account-size-limits](../../includes/data-box-storage-account-size-limits.md)]
## <a name="azure-object-size-limits"></a>Azure object size limits
[!INCLUDE [data-box-object-size-limits](../../includes/data-box-object-size-limits.md)]
## <a name="azure-block-blob-page-blob-and-file-naming-conventions"></a>Azure block blob, page blob, and file naming conventions
[!INCLUDE [data-box-naming-conventions](../../includes/data-box-naming-conventions.md)]
| 38.142857 | 131 | 0.762975 | tur_Latn | 0.76482 |
3be10870978a3841f38df141a97a9a296864e803 | 4,887 | md | Markdown | src/posts/as-the-us-govt-reopens-so-does-sovereignty-club.md | Munfred/Eleventy-Starter-Boilerplate-test | 1edd001f9789e656aaf5cfbd510324939694d3fb | [
"MIT"
] | null | null | null | src/posts/as-the-us-govt-reopens-so-does-sovereignty-club.md | Munfred/Eleventy-Starter-Boilerplate-test | 1edd001f9789e656aaf5cfbd510324939694d3fb | [
"MIT"
] | 3 | 2021-11-06T07:40:11.000Z | 2022-03-03T17:54:47.000Z | content/posts/as-the-us-govt-reopens-so-does-sovereignty-club.md | Munfred/sovclub-site | 70ffacad20295f85348d7be812c1d1b5fec9c0fa | [
"MIT"
] | null | null | null | ---
title: " As the US govt reopens so does Sovereignty Club"
subtitle: 🚪🔑🔓📂😦
category:
- Weekly Discussions
author: Eduardo
date: 2019-01-30T15:00:00.000Z
featureImage: /uploads/sovball.png
---
It has been a long shutdown - 47 days and counting! But as America [reels](https://www.nytimes.com/2019/01/26/us/politics/government-shutdown-legislation.html) with the end/pause of the **greatest partial US government shutdown in the history of humankind**, we finally got around to reopening Sovereignty club!
We'll be meeting this **Wednesday, 6pm** at the [Sovereignty Lounge](https://i.imgur.com/zQNZHyv.jpg), BBB B101 to discuss... what has been up with the world for the past month and a half! (we were going to have a proper topic before I realized we'd probably spend the whole hour going over current events either way).\
\
In case you were too busy living in your own little bubble, here's some of what's been happening (my coverage is biased so you should try to rectify that with your own links):\
\
🇻🇪🔥 [Venezuela had an election!](https://en.wikipedia.org/wiki/2019_Venezuelan_presidential_crisis) Back in May. Guess who won! [But 8 months after the fact the national assembly comes around to complain, then America says something, then Maduro gets pissed off and tells Americans to gtfo because he's severing diplomatic ties, then Russia says something, and here we are with this hullabaloo](https://www.nytimes.com/2019/01/24/world/americas/venezuela-news-maduro-russia.html). Yeah. If you're unaware of what's going on, the NYT wrote a [Venezuela for dummies primer.](https://www.nytimes.com/2019/01/24/world/americas/noticias-venezuela-protests-maduro-guaido.html) But really the most visceral depiction of Venezuelan reality you could ask for is this Bloomberg article on how [Venezuelans are resorting to farming gold in Runescape and Tibia for income](https://web.archive.org/web/20181115114316/https://www.bloomberg.com/news/articles/2017-12-05/desperate-venezuelans-turn-to-video-games-to-survive).
🇧🇷💯 [Brazil had an election!](https://www.bloomberg.com/news/articles/2018-10-28/everything-you-need-to-know-about-brazil-s-election-balance-of-power-special) Back in October. But it was only this month that [Trumponaro](https://www.aljazeera.com/news/2018/10/jair-bolsonaro-brazil-presidential-candidate-181007020716337.html) took over. He did not disappoint, however - [scandals began before he was even inaugurated](https://theintercept.com/2019/01/24/video-the-dramatic-scandal-swallowing-the-bolsonaro-presidency-and-which-just-drove-an-lgtb-congressman-to-flee-brazil/). That was a great relief - for a few days after the election one might have had the impression the country had a semblance of institutional rule of law, which would spell trouble for future seasons of Netflix's wonderful series [The Mechanism](https://en.wikipedia.org/wiki/The_Mechanism_(TV_series)).
🇨🇦📱🇨🇳 [Canada is still embroiled with China](https://www.bbc.com/news/world-us-canada-47015700) over Huawei for actually sticking to the treaties they sign. [THANKS AMERICA](https://www.theverge.com/2019/1/27/18199590/canada-fired-chinese-ambassador-john-mccallum-arrested-huawei-executive-meng-wanzhou), says Canada's ambassador to China shortly before getting fired.
🇬🇧💔🇪🇺 [Brexit continues to go nowhere!](https://www.theatlantic.com/international/archive/2019/01/vote-crunch-week-brexit-britain/580930/) But I'm not sure that's news.
🇸🇾💥🇺🇸 The Syrian civil war continues to wane, with [Mr. Assad on track to win](https://www.bloomberg.com/news/articles/2019-01-28/assad-is-close-to-victory-but-syria-cauldron-spawns-new-conflict). Meanwhile Mr. Trump ["essentially"](https://www.nytimes.com/2019/01/02/us/politics/trump-mattis-defense-secretary-generals.html) fired the secretary of defense over withdrawing troops from Syria.\
\
🇲🇽 ⛺ The migrant caravan is here to stay! [In Mexico](https://www.nytimes.com/2019/01/25/world/americas/migrant-caravan-honduras-mexico.html). For now. In an interesting development, Mexico's [new president](https://www.nytimes.com/2019/01/25/world/americas/migrant-caravan-honduras-mexico.html) moved to speed up and facilitate issuing humanitarian work visas for migrants.\
\
\
I think that's pretty much all that has happened since last meeting. If I missed something please post on [this week's discussion thread at Sovereignty Forums](http://forum.caltechsovereignty.club/t/jan-30-discussion-as-the-us-govt-reopens-so-does-sovereignty-club/55).
See you Wednesday!
Eduardo\
\
PS: I heard there might be 🍕 and 🍻
3be183381d1566f8c980dc535cd17444d71469aa | 91 | md | Markdown | README.md | energicryptocurrency/energi-vote-tracker | d2523d8c46455ac208be11f21eca040a9100218c | [
"MIT"
] | 3 | 2018-08-24T14:29:02.000Z | 2019-09-13T23:17:23.000Z | README.md | energicryptocurrency/energi-vote-tracker | d2523d8c46455ac208be11f21eca040a9100218c | [
"MIT"
] | 1 | 2018-05-08T23:05:44.000Z | 2018-05-08T23:05:44.000Z | README.md | energicryptocurrency/energi-vote-tracker | d2523d8c46455ac208be11f21eca040a9100218c | [
"MIT"
] | null | null | null | # energi-vote-tracker
Might want to change the url in `index.html` and `masternode.html`.
| 22.75 | 67 | 0.747253 | eng_Latn | 0.796807 |
3be224130f22d2290a64360df25f956e8f2edb35 | 162 | md | Markdown | README.md | yashkp1234/SocialMediaApp | 47f229cfb80ec1a58b888404b775bf94af21d05a | [
"MIT"
] | null | null | null | README.md | yashkp1234/SocialMediaApp | 47f229cfb80ec1a58b888404b775bf94af21d05a | [
"MIT"
] | 5 | 2021-05-11T02:40:19.000Z | 2022-03-26T02:56:47.000Z | README.md | yashkp1234/SocialMediaApp | 47f229cfb80ec1a58b888404b775bf94af21d05a | [
"MIT"
] | null | null | null | # SocialMediaApp
Create posts and interact with users, built using MongoDB, Express, Node JS, React, GraphQL and Apollo
Live Demo: https://yellhere.netlify.com/
| 32.4 | 102 | 0.783951 | eng_Latn | 0.608564 |
3be28696220dd4bebf42c70b0fe3269c4168a235 | 275 | md | Markdown | source/_patterns/00-styles/02-iconography/02-icons-utility.md | OlivierAlbertini/hochelaga | c848695f4f77be79ccca1418df2faed67f96e755 | [
"MIT"
] | null | null | null | source/_patterns/00-styles/02-iconography/02-icons-utility.md | OlivierAlbertini/hochelaga | c848695f4f77be79ccca1418df2faed67f96e755 | [
"MIT"
] | null | null | null | source/_patterns/00-styles/02-iconography/02-icons-utility.md | OlivierAlbertini/hochelaga | c848695f4f77be79ccca1418df2faed67f96e755 | [
"MIT"
] | null | null | null | ---
title: Utility
---
## Utilisation
Pour utiliser les icones utilitaires, il faut inclure la classe <code>.icon</code> sur le contenant ainsi que le nom de l'icone que vous voulez inclure.
Par exemple: <code><span class="icon icon-arrow-left"> </span></code> | 34.375 | 152 | 0.723636 | fra_Latn | 0.974026 |
3be2a01e418fa5610c109775f8e4d0fce27542bc | 1,175 | md | Markdown | _publications/2018-08-02.md | akfraik/akfraik.github.io | 32fe8bf468e740e1037013ac2a5926600ee8726a | [
"MIT"
] | null | null | null | _publications/2018-08-02.md | akfraik/akfraik.github.io | 32fe8bf468e740e1037013ac2a5926600ee8726a | [
"MIT"
] | null | null | null | _publications/2018-08-02.md | akfraik/akfraik.github.io | 32fe8bf468e740e1037013ac2a5926600ee8726a | [
"MIT"
] | 2 | 2020-12-16T00:20:53.000Z | 2021-07-27T01:12:05.000Z | ---
title: "The devil is in the details: Genomics of transmissible cancers in Tasmanian devils"
collection: publications
permalink: /publication/2010-10-01-paper-title-number-2
excerpt: 'The devil is in the details: Genomics of transmissible cancers in Tasmanian devils'
date: 2018-08-02
venue: 'PLOS Pathogens'
paperurl: 'https://journals.plos.org/plospathogens/article/file?id=10.1371/journal.ppat.1007098&type=printable'
citation: 'A. Storfer, P. A. Hohenlohe, M. Margres, H. I. McCallum, A. Patton, <b>A. K. Fraik</b>, M. Lawrance, A. Stahlke, M. E. Jones, L. Ricci. (2018). "The devil is in the details: Genomics of transmissible cancers in Tasmanian devils." <i>PLOS Pathogens </i>. 14.(8).'
---
The devil is in the details: Genomics of transmissible cancers in Tasmanian devils.
[Download paper here](https://journals.plos.org/plospathogens/article/file?id=10.1371/journal.ppat.1007098&type=printable)
A. Storfer, P. A. Hohenlohe, M. Margres, H. I. McCallum, A. Patton, <b>A. K. Fraik</b>, M. Lawrance, A. Stahlke, M. E. Jones, L. Ricci. (2018). "The devil is in the details: Genomics of transmissible cancers in Tasmanian devils." <i>PLOS Pathogens </i>. 14.(8).
| 73.4375 | 274 | 0.734468 | eng_Latn | 0.649917 |
3be2a1a2952523a7d3cff8ad2d362dead968da9f | 3,161 | md | Markdown | _posts/2020-11-27-cs224w-3.md | Yunlongs/Yunlongs.github.io | b0f115926a72f338acce3a919cba3aa29be5d4a7 | [
"MIT"
] | 5 | 2019-05-09T13:57:08.000Z | 2021-04-05T16:19:30.000Z | _posts/2020-11-27-cs224w-3.md | Yunlongs/Yunlongs.github.io | b0f115926a72f338acce3a919cba3aa29be5d4a7 | [
"MIT"
] | 87 | 2019-01-11T05:37:43.000Z | 2021-11-19T08:09:40.000Z | _posts/2020-11-27-cs224w-3.md | Yunlongs/Yunlongs.github.io | b0f115926a72f338acce3a919cba3aa29be5d4a7 | [
"MIT"
] | 1 | 2021-11-01T07:15:44.000Z | 2021-11-01T07:15:44.000Z | ---
layout: post
title: Stanford CS224W Graph Machine Learning Notes (3)
subtitle: Lecture 3 -Motifs and Structural Roles in Networks
date: 2020-11-27
author: Yunlongs
catalog: true
tags:
- Machine Learning
- Stanford CS224W Graph Machine Learning
---
# Motifs and Structural Roles in Networks
## 1. Subgraphs and Motifs
### 1.1 Network Motifs
**Network motifs** are defined here as recurring, significant patterns of interconnections in a network. They help us understand how the network works and predict its function.
**1. Induced subgraph:** an induced subgraph G' has vertex set V' ⊆ V, and for every pair of vertices in V', any edge between them in the original graph G must also appear in E'.
**2. Recurrence:** for example, the subgraph pattern on the left occurs 4 times in the following graph.
![](https://yunlongs-1253041399.cos.ap-chengdu.myqcloud.com/image/Similary_Detection/188.png)
**3. Significance:** a subgraph pattern that occurs more often in the real network than in random networks is over-represented; otherwise it is under-represented.
Formally:
$$Z_i = \frac{N^{real}_i - \bar{N_i}^{rand}}{std(N_i^{rand})}$$
where $N_i^{real}$ is the number of occurrences of subgraph $i$ in the real network and $\bar{N_i}^{rand}$ is the mean number of occurrences of subgraph $i$ in random networks.
The normalized significance measure (significance profile) is:
$$SP_i = \frac{Z_i}{\sqrt{\sum_jZ_j^2}}$$
This score gives the relative significance of each subgraph type.
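The two formulas above can be sketched in code (an illustrative TypeScript sketch, not part of the course materials; the names `zScore` and `significanceProfile` are chosen here):

```typescript
// Z-score of motif i: compare the real-network count against the counts
// observed in an ensemble of random networks.
function zScore(nReal: number, nRand: number[]): number {
  const mean = nRand.reduce((s, x) => s + x, 0) / nRand.length;
  const variance = nRand.reduce((s, x) => s + (x - mean) ** 2, 0) / nRand.length;
  return (nReal - mean) / Math.sqrt(variance);
}

// Normalize a vector of Z-scores into a significance profile (SP).
function significanceProfile(z: number[]): number[] {
  const norm = Math.sqrt(z.reduce((s, zi) => s + zi * zi, 0));
  return z.map(zi => zi / norm);
}

console.log(zScore(10, [4, 6, 8]));       // ≈ 2.449
console.log(significanceProfile([3, 4])); // [0.6, 0.8]
```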
### 1.2 Configuration Model
An important step in measuring motifs above is the comparison against random networks, but how do we generate random networks that match the degree distribution of the real network?
**First model:**
Break each node up into mini-nodes (spokes) and pair the mini-nodes at random. Drawback: some pairs of nodes may end up with double edges.
![](https://yunlongs-1253041399.cos.ap-chengdu.myqcloud.com/image/Similary_Detection/189.png)
**Second model (edge switching):**
Repeat:
- randomly select a pair of edges $A\rightarrow B, C\rightarrow D$
- exchange the endpoints of the two edges
![](https://yunlongs-1253041399.cos.ap-chengdu.myqcloud.com/image/Similary_Detection/190.png)
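A minimal sketch of the edge-switching model in plain Python (directed edges, as in the picture; the function name and the attempt cap are my own choices):

```python
import random

def switch_edges(edges, n_swaps=100, seed=0):
    """Degree-preserving randomization: repeatedly pick two edges
    (a, b) and (c, d) and rewire them to (a, d) and (c, b),
    rejecting swaps that would create self-loops or multi-edges."""
    rng = random.Random(seed)
    edges = [tuple(e) for e in edges]
    edge_set = set(edges)
    swaps = attempts = 0
    while swaps < n_swaps and attempts < 100 * n_swaps:
        attempts += 1
        i, j = rng.sample(range(len(edges)), 2)
        (a, b), (c, d) = edges[i], edges[j]
        if a == d or c == b or (a, d) in edge_set or (c, b) in edge_set:
            continue  # would create a self-loop or a duplicate edge
        edge_set -= {(a, b), (c, d)}
        edge_set |= {(a, d), (c, b)}
        edges[i], edges[j] = (a, d), (c, b)
        swaps += 1
    return edges
```

Each swap leaves every node's in-degree and out-degree unchanged, so the result is a degree-matched random network.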
### 1.3 Variations on Motifs
The motifs above admit the following variations:
- directed and undirected
- colored and uncolored
- ...

## 2. Graphlets: Node feature vectors
### 2.1 New Concept: Graphlets
**Graphlets:** connected non-isomorphic subgraphs.
**Graphlet Degree Vector (GDV):** a vector counting, for each position, the number of isomorphic subgraphs the node touches at that position.
For example, the GDV of node v below:
![](https://yunlongs-1253041399.cos.ap-chengdu.myqcloud.com/image/Similary_Detection/191.png)
The GDV counts the graphlets a node touches at particular positions, which gives us a way to measure a node's local network topology.

### 2.2 Finding Motifs and Graphlets
Finding the motifs and graphlets described in the previous two sections requires solving two challenges:
1. **Enumerate** all connected subgraphs
2. **Count** the number of occurrences of each subgraph class

However, deciding whether a subgraph appears in another graph is an NP-complete problem.
So the feasible motif sizes are usually small (3-8).

### 2.3 Exact Subgraph Enumeration (ESU)
A subgraph-counting algorithm in common use today.
It maintains two sets:
- $V_{subgraph}$: the subgraph (motif) constructed so far
- $V_{extension}$: the set of candidate nodes for extending the motif

**Idea:** starting from a node $v$, add a node $u$ to the $V_{extension}$ set only if it satisfies the following conditions:
- $u$'s node id is greater than $v$'s
- $u$ is a neighbor of the newly added node, but must not be a neighbor of any node already in $V_{subgraph}$

![](https://yunlongs-1253041399.cos.ap-chengdu.myqcloud.com/image/Similary_Detection/192.png)
An example: with k=3, all subgraphs with 3 nodes can be derived
![](https://yunlongs-1253041399.cos.ap-chengdu.myqcloud.com/image/Similary_Detection/193.png)
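The ESU recursion can be sketched in Python (a compact rendering of Wernicke's pseudocode; `graph` is an adjacency dict and node ids are assumed comparable):

```python
def esu(graph, k):
    """Enumerate all connected induced subgraphs of size k.
    graph: dict mapping node -> set of neighbor nodes."""
    found = []

    def extend(subgraph, extension, v):
        if len(subgraph) == k:
            found.append(frozenset(subgraph))
            return
        extension = set(extension)
        while extension:
            w = extension.pop()
            # exclusive neighbors of w: id greater than the start node v,
            # not in the subgraph, and not adjacent to any subgraph node
            exclusive = {u for u in graph[w]
                         if u > v and u not in subgraph
                         and all(u not in graph[s] for s in subgraph)}
            extend(subgraph | {w}, extension | exclusive, v)

    for v in graph:
        extend({v}, {u for u in graph[v] if u > v}, v)
    return found

path = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 5}, 5: {4}}
triples = esu(path, 3)
```

On the 5-node path this finds exactly the three connected triples; each enumerated vertex set would then be classified by isomorphism type to obtain motif counts.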
**Next step:**
Count the number of occurrences of each class of isomorphic subgraphs.

## 3. Structural Roles in Networks
**Roles:** a group of nodes with similar structural properties.
**Communities/Groups:** a group of nodes that are connected to one another.
**Structural equivalence:** nodes $u$ and $v$ are structurally equivalent if they have the same relationships to all other nodes.

### 3.1 Discovering Structural Roles in Networks
Here we introduce a structural-role discovery method named **RolX**.
Its workflow is as follows:
input the adjacency matrix --> recursively extract features --> extract roles
![](https://yunlongs-1253041399.cos.ap-chengdu.myqcloud.com/image/Similary_Detection/194.png)

**Recursive feature extraction:** turn network connectivity into structural features.
As shown below, local features are extracted for every node in the network, and regional features are then generated recursively.
*Local features:* degree, weights.
*Egonet features:* computed over the node, its neighbors, and the edges among them. For example, the number of edges within the egonet, or the number of edges entering or leaving the egonet.
![](https://yunlongs-1253041399.cos.ap-chengdu.myqcloud.com/image/Similary_Detection/195.png)
The specific procedure is as follows:
- start from a base set of node features
- use the current set of node features to generate additional features
- - two kinds of aggregate functions: mean and sum
- - for example, the mean degree over all neighbor nodes
- sum or average all of the current features, then repeat
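The recursive step can be sketched with just the degree as the base feature and mean/sum aggregation (ReFeX/RolX use richer base and egonet features; this only shows the mechanics):

```python
import numpy as np

def recursive_features(adj, n_iters=2):
    """adj: dict node -> set of neighbors.
    Start from the degree, then repeatedly append the mean and the sum
    of the neighbors' current feature vectors."""
    feats = {v: [float(len(adj[v]))] for v in adj}
    for _ in range(n_iters):
        new = {}
        for v in adj:
            neigh = [feats[u] for u in adj[v]]
            agg = (list(np.mean(neigh, axis=0)) + list(np.sum(neigh, axis=0))
                   if neigh else [0.0] * (2 * len(feats[v])))
            new[v] = feats[v] + agg
        feats = new
    return feats

features = recursive_features({1: {2}, 2: {1, 3}, 3: {2}})
```

Each iteration triples the feature length; on this 3-node path, nodes 1 and 3 end up with identical vectors, reflecting their identical structural role at the two ends.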
**Role extraction:** cluster nodes based on the extracted features.
This method can be used to evaluate the structural similarity of nodes.
3be370162494c2a0fd4118544467cd846beab473 | 78 | md | Markdown | README.md | lukasz-borowka/Sudoku-solver | dc18aea742731efd6681298309f8d40d7a016414 | [
"MIT"
] | null | null | null | README.md | lukasz-borowka/Sudoku-solver | dc18aea742731efd6681298309f8d40d7a016414 | [
"MIT"
] | null | null | null | README.md | lukasz-borowka/Sudoku-solver | dc18aea742731efd6681298309f8d40d7a016414 | [
"MIT"
] | null | null | null | # Sudoku solver
A short script that can solve any Sudoku puzzle in milliseconds
| 26 | 61 | 0.807692 | eng_Latn | 0.958291 |
3be41d49ba6e9e47dc647f35154a180deb3fd447 | 3,421 | md | Markdown | docs/rendering.md | lixiny/ContactPose | 3ab9c976660eb3ca4a13fb2c252f442758955cd0 | [
"MIT"
] | 199 | 2020-07-29T19:00:19.000Z | 2022-03-31T13:35:58.000Z | docs/rendering.md | lixiny/ContactPose | 3ab9c976660eb3ca4a13fb2c252f442758955cd0 | [
"MIT"
] | 21 | 2020-08-30T18:53:35.000Z | 2022-03-22T20:41:13.000Z | docs/rendering.md | lixiny/ContactPose | 3ab9c976660eb3ca4a13fb2c252f442758955cd0 | [
"MIT"
] | 26 | 2020-08-10T14:08:43.000Z | 2022-03-14T10:58:12.000Z | # Rendering and Masking Operations
These operations allow you to render depth maps of the object and hand in each image. You can then derive masks by thresholding the depth images. All rendering is done with [pyrender](https://pyrender.readthedocs.io/en/latest/).
## Setup
It is important to install pyrender within the `contactpose` conda environment. In our experience pyrender headless rendering works only in Linux. The following instructions are adapted from the [official installation guide](https://pyrender.readthedocs.io/en/latest/install/index.html).
- Build OSMesa from source and install to the `contactpose` conda env:
```bash
$ conda activate contactpose
(contactpose) $ sudo apt-get install llvm-6.0 freeglut3 freeglut3-dev
(contactpose) $ cd ~/Downloads && wget ftp://ftp.freedesktop.org/pub/mesa/mesa-18.3.3.tar.gz
(contactpose) $ tar xfv mesa-18.3.3.tar.gz
(contactpose) $ cd mesa-18.3.3
(contactpose) $ ./configure --prefix=${CONDA_PREFIX} \
--enable-opengl --disable-gles1 --disable-gles2 \
--disable-va --disable-xvmc --disable-vdpau \
--enable-shared-glapi \
--disable-texture-float \
--enable-gallium-llvm --enable-llvm-shared-libs \
--with-gallium-drivers=swrast,swr \
--disable-dri --with-dri-drivers= \
--disable-egl --with-egl-platforms= --disable-gbm \
--disable-glx \
--disable-osmesa --enable-gallium-osmesa \
ac_cv_path_LLVM_CONFIG=llvm-config-6.0
(contactpose) $ make -j8
(contactpose) $ make install
```
- Set some environment variables to be loaded when the conda env is activated
(instructions from
[here](https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html#saving-environment-variables)):
```bash
(contactpose) $ cd $CONDA_PREFIX
(contactpose) $ mkdir -p ./etc/conda/activate.d
(contactpose) $ mkdir -p ./etc/conda/deactivate.d
(contactpose) $ touch ./etc/conda/activate.d/env_vars.sh
(contactpose) $ touch ./etc/conda/deactivate.d/env_vars.sh
```
Edit `./etc/conda/activate.d/env_vars.sh` as follows:
```
#!/bin/sh
export OLD_LIBRARY_PATH=$LIBRARY_PATH
export OLD_LD_LIBRARY_PATH=$LD_LIBRARY_PATH
export OLD_C_INCLUDE_PATH=$C_INCLUDE_PATH
export OLD_CPLUS_INCLUDE_PATH=$CPLUS_INCLUDE_PATH
MESA_HOME=$CONDA_PREFIX
export LIBRARY_PATH=$LIBRARY_PATH:$MESA_HOME/lib
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$MESA_HOME/lib
export C_INCLUDE_PATH=$C_INCLUDE_PATH:$MESA_HOME/include
export CPLUS_INCLUDE_PATH=$CPLUS_INCLUDE_PATH:$MESA_HOME/include
```
Edit `./etc/conda/deactivate.d/env_vars.sh` as follows:
```
#!/bin/sh
export LIBRARY_PATH=$OLD_LIBRARY_PATH
export LD_LIBRARY_PATH=$OLD_LD_LIBRARY_PATH
export C_INCLUDE_PATH=$OLD_C_INCLUDE_PATH
export CPLUS_INCLUDE_PATH=$OLD_CPLUS_INCLUDE_PATH
unset OLD_LIBRARY_PATH
unset OLD_LD_LIBRARY_PATH
unset OLD_C_INCLUDE_PATH
unset OLD_CPLUS_INCLUDE_PATH
```
- Install a compatible fork of PyOpenGL:
```bash
(contactpose) $ git clone git@github.com:mmatl/pyopengl.git
(contactpose) $ pip install ./pyopengl
```
- Finally, install `pyrender`:
```bash
(contactpose) $ pip install pyrender
```
## [Demo Notebook](../rendering.ipynb)
[![Demo Notebook](../readme_images/rendering_notebook_teaser.gif)](../rendering.ipynb)
| 39.77907 | 287 | 0.713534 | eng_Latn | 0.375844 |
3be41e493f90c9978ac296f20e3e4bae5a45eff5 | 289 | md | Markdown | README.md | ReflectionMasters/List-Processing-Task | 64cc27a7f20655d2cd7a6a34da09d476696e213d | [
"MIT"
] | null | null | null | README.md | ReflectionMasters/List-Processing-Task | 64cc27a7f20655d2cd7a6a34da09d476696e213d | [
"MIT"
] | 1 | 2017-11-15T18:31:19.000Z | 2017-11-20T08:21:43.000Z | README.md | ReflectionMasters/List-Processing-Task | 64cc27a7f20655d2cd7a6a34da09d476696e213d | [
"MIT"
] | null | null | null | # List Processing
Design and implement a console-based application for list processing. It should read a list of strings from the console and continuously execute commands (add, remove, replace, invert) over the list.
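The command semantics could be sketched like this (a Python sketch only; the task itself is language-agnostic, and the command names are taken from the description above):

```python
def execute(items, command):
    """Apply one console command to the list in place and return it."""
    op, *args = command.split()
    if op == "add":
        items.append(args[0])
    elif op == "remove":
        items.remove(args[0])
    elif op == "replace":
        old, new = args
        items[items.index(old)] = new
    elif op == "invert":
        items.reverse()
    else:
        raise ValueError(f"unknown command: {op}")
    return items

state = ["red", "green", "blue"]
for cmd in ["add yellow", "replace green lime", "invert"]:
    execute(state, cmd)
```

A real solution would wrap this in a read-loop over console input and print the list after each command.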
## Trello project :honeybee:
https://trello.com/b/YdLdFmEf
| 32.111111 | 207 | 0.778547 | eng_Latn | 0.989656 |
3be54618efb2c3bfaab0110a29ebe25f987589d2 | 80 | md | Markdown | README.md | garcierl/Homework_4 | 95a101c6b03f5c7b68162d9e0a1a6fbd641a4585 | [
"MIT"
] | null | null | null | README.md | garcierl/Homework_4 | 95a101c6b03f5c7b68162d9e0a1a6fbd641a4585 | [
"MIT"
] | null | null | null | README.md | garcierl/Homework_4 | 95a101c6b03f5c7b68162d9e0a1a6fbd641a4585 | [
"MIT"
] | null | null | null | # Homework_4
https://in-info-web4.informatics.iupui.edu/~garcierl/Homework_4/
| 20 | 64 | 0.775 | kor_Hang | 0.318413 |
3be593b84d05ef678649f00d2f5974f0fd569808 | 1,019 | md | Markdown | 03-multi-core-processors/01-openMP/07-critical/README.md | javierip/parallel-processing-teaching-toolkit | dd2339fbcab86365e24efa3d6cf01165e46fdbdd | [
"Apache-2.0"
] | 4 | 2017-07-11T02:16:20.000Z | 2020-04-26T00:50:22.000Z | 03-multi-core-processors/01-openMP/07-critical/README.md | javierip/parallel-processing-teaching-toolkit | dd2339fbcab86365e24efa3d6cf01165e46fdbdd | [
"Apache-2.0"
] | 1 | 2017-07-16T21:25:34.000Z | 2017-07-16T21:25:34.000Z | 03-multi-core-processors/01-openMP/07-critical/README.md | javierip/parallel-processing-teaching-toolkit | dd2339fbcab86365e24efa3d6cf01165e46fdbdd | [
"Apache-2.0"
] | 6 | 2017-07-05T22:25:39.000Z | 2018-09-19T10:02:32.000Z | ## About this example
This examples shows the use of the _critical_ directive in OpenMP.
## Requirements
You should have a compiler installed. Ubuntu Linux:
```bash
sudo apt-get install cmake
```
## Run
Open a terminal and type:
```bash
> sh run.sh
```
## Output
A typical output should look like this one:
```
Using critical
Thread 1 is accessing value 1
Thread 7 is accessing value 2
Thread 0 is accessing value 3
Thread 5 is accessing value 4
Thread 2 is accessing value 5
Thread 4 is accessing value 6
Thread 3 is accessing value 7
Thread 6 is accessing value 8
Final value of the addition is 8
Not using critical
Thread 6 is accessing value 1
Thread 7 is accessing value 2
Thread 2 is accessing value 2
Thread 4 is accessing value 2
Thread 3 is accessing value 2
Thread 0 is accessing value 3
Thread 5 is accessing value 2
Thread 1 is accessing value 2
Final value of the addition is 3
```
## Extra Resources
* http://openmp.org/wp/resources/
* https://msdn.microsoft.com/en-us/library/b38674ky.aspx
| 18.527273 | 66 | 0.757605 | eng_Latn | 0.982273 |
3be5a6d7b6d7e8b2afc83784e67aca9d1416f1e9 | 797 | md | Markdown | lista_ejercicios.md | 7german7/python_comands | 2c889235003f6afadc5cfab4b90980b5963ec2e8 | [
"MIT"
] | 2 | 2020-05-15T13:48:45.000Z | 2020-05-18T01:06:22.000Z | lista_ejercicios.md | 7german7/python_comands | 2c889235003f6afadc5cfab4b90980b5963ec2e8 | [
"MIT"
] | 1 | 2020-05-15T18:42:31.000Z | 2020-05-15T18:42:31.000Z | lista_ejercicios.md | 7german7/python_comands | 2c889235003f6afadc5cfab4b90980b5963ec2e8 | [
"MIT"
] | 1 | 2020-05-15T13:48:48.000Z | 2020-05-15T13:48:48.000Z | # Lista de Ejercicios por hacer
* ~~[A trabajar con strings](https://platzi.com/clases/1104-python/7091-basicos-de-strin-6/ "Ir")~~
* ~~[Codificar los strings](https://platzi.com/clases/1104-python/7093-comparacion-de-strings-y-unico-1/ "Ir")~~
* ~~[A trabajar con ciclos](https://platzi.com/clases/1104-python/7094-ciclos-en-python-con-for/ "Ir")~~
* [El Ahorcado](https://platzi.com/clases/1104-python/7101-logica-del-ahorcado/ "Ir")
* [Busqueda Binaria](https://platzi.com/clases/1104-python/7103-implementar-busqueda-binar-7/ "Ir")
* [Encriptar mensajes usando diccionarios](https://platzi.com/clases/1104-python/7105-encriptar-mensajes-usando-diccionarios/ "Ir")
* [Encontrar primer caracter que no se repita en un string](https://platzi.com/clases/1104-python/7107-ejemplo-con-tupl-3/ "Ir") | 88.555556 | 131 | 0.74404 | spa_Latn | 0.534515 |
3be6700d674f63a86fc86438a285e12d849f6727 | 1,094 | md | Markdown | README.md | bh4sith/nova-search-engine-ui | 37b88068b407e0a55489a1e23a61973772cceaf8 | [
"MIT"
] | null | null | null | README.md | bh4sith/nova-search-engine-ui | 37b88068b407e0a55489a1e23a61973772cceaf8 | [
"MIT"
] | null | null | null | README.md | bh4sith/nova-search-engine-ui | 37b88068b407e0a55489a1e23a61973772cceaf8 | [
"MIT"
] | null | null | null | # nova-search-engine-ui
UI for the simple web search engine https://github.com/bh4sith/nova-search-engine
## INSTALL AND RUN
### REQUIREMENTS
This tool requires *Python3+* and the web search engine API (see link above).
### WITH PIP
```
git clone https://github.com/bh4sith/nova-search-engine-ui.git
cd nova-search-engine-ui
pip install -r requirements.txt
```
Then, run the tool:
```
FLASK_APP=index.py HOST=<ip> PORT=<port> flask run --port 80
```
Where:
* `ip` + `port`: route to the web search engine API
To run in debug mode, prepend `FLASK_DEBUG=1` to the command :
```
FLASK_DEBUG=1 ... flask run --port 80
```
### WITH DOCKER
Build yourself a Docker image:
```
git clone https://github.com/bh4sith/nova-search-engine-ui.git
cd nova-search-engine-ui
docker build -t nova-search-engine-ui .
```
```
docker run -p 80:5000 \
-e "HOST=<ip>" \
-e "PORT=<port>" \
bh4sith/nova-search-engine-ui
```
Where:
* `ip` + `port`: route to the web search engine API
## USAGE AND EXAMPLES
To use the search engine, just type this endpoint in your web browser: http://localhost/
## LICENCE
MIT
| 21.038462 | 89 | 0.695612 | yue_Hant | 0.417868 |
3be67ba29d47c06d4d5198944051f497e41c36e4 | 1,215 | md | Markdown | notes/0.2.0.md | scalafx/scalafx-extras | 6276ffe987ec06486a03f571432f86f60344b4d0 | [
"BSD-3-Clause"
] | 9 | 2017-07-21T08:12:43.000Z | 2022-02-22T00:27:44.000Z | notes/0.2.0.md | scalafx/scalafx-extras | 6276ffe987ec06486a03f571432f86f60344b4d0 | [
"BSD-3-Clause"
] | 11 | 2018-11-15T02:52:00.000Z | 2021-12-16T01:16:27.000Z | notes/0.2.0.md | scalafx/scalafx-extras | 6276ffe987ec06486a03f571432f86f60344b4d0 | [
"BSD-3-Clause"
] | 2 | 2017-04-04T20:46:21.000Z | 2017-04-05T02:36:47.000Z | ### ScalaFX-Extras Release v.0.2.0
This is a feature release; several new concepts were added:
* The mixin `ShowMessage` makes it easier to display dialogs.
* The `BusyWorker` helps run UI tasks on separate threads (other than the JavaFX Application thread).
  It gives an option to show progress and status messages.
* The `ImageDisplay` component shows images with the ability to zoom in, zoom out, and zoom to fit.
  It can also automatically resize to its parent's size.
There were also significant changes to the Model-View-X pattern code.
The `View` changed name to `ControllerFX` to match naming used in JavaFX.
`Model` was renamed to `ModelFX`.
`ModelView` was renamed to `MVCfx`.
There were some other changes for smoother integration with the new `ShowMessage` and `BusyWorker` classes.
The ending `FX` was added to avoid conflicts with ScalaFXML macros clashing with the name `Controller`.
There are new demos in the `scalafx-extras-demos` project that illustrate the use of the `scalafx-extras` features.
To post questions please use [ScalaFX Users Group][5] or [StackOverflow ScalaFX][6]
[5]: https://groups.google.com/forum/#!forum/scalafx-users
[6]: https://stackoverflow.com/questions/tagged/scalafx
| 50.625 | 110 | 0.768724 | eng_Latn | 0.994944 |
3be8aa92668487a9cbbb7495dd895b8ee7e74029 | 4,606 | md | Markdown | docs/framework/winforms/user-input-in-a-windows-forms-application.md | Jonatandb/docs.es-es | c18663ce8a09607fe195571492cad602bc2f01bb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/winforms/user-input-in-a-windows-forms-application.md | Jonatandb/docs.es-es | c18663ce8a09607fe195571492cad602bc2f01bb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/winforms/user-input-in-a-windows-forms-application.md | Jonatandb/docs.es-es | c18663ce8a09607fe195571492cad602bc2f01bb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Datos proporcionados por el usuario en una aplicación Windows Forms
titleSuffix: ''
ms.date: 03/30/2017
helpviewer_keywords:
- Windows Forms, user input
ms.assetid: 9d61fa96-70f7-4754-885a-49a4a6316bdb
ms.openlocfilehash: 8e82276f14519c4ef54948744c93014232bdff52
ms.sourcegitcommit: de17a7a0a37042f0d4406f5ae5393531caeb25ba
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 01/24/2020
ms.locfileid: "76734806"
---
# <a name="user-input-in-a-windows-forms-application"></a>User input in a Windows Forms application
 In Windows Forms, user input is sent to applications in the form of Windows messages. A series of overridable methods process these messages at the application, form, and control levels. When these methods receive mouse and keyboard messages, they raise events that can be handled to obtain information about the mouse or keyboard input. In many cases, Windows Forms applications can process all user input by handling these events. In other cases, an application may need to override one of the methods that process messages in order to intercept a particular message before it is received by the application, the form, or the control.

## <a name="mouse-and-keyboard-events"></a>Mouse and keyboard events
 All Windows Forms controls inherit a set of events related to mouse and keyboard input. For example, a control can handle the <xref:System.Windows.Forms.Control.KeyPress> event to determine the character code of a key that was pressed, or the <xref:System.Windows.Forms.Control.MouseClick> event to determine the location of a mouse click. For more information about mouse and keyboard events, see [Using keyboard events](using-keyboard-events.md) and [Mouse events in Windows Forms](mouse-events-in-windows-forms.md).

## <a name="methods-that-process-user-input-messages"></a>Methods that process user input messages
 Forms and controls have access to the <xref:System.Windows.Forms.IMessageFilter> interface and a set of overridable methods that process Windows messages at different points in the message queue. All of these methods take a <xref:System.Windows.Forms.Message> parameter, which encapsulates the low-level details of Windows messages. You can implement or override these methods to examine the message and then either consume it or pass it on to the next consumer in the message queue. The following table presents the methods that process all Windows messages in Windows Forms.

|Method|Notes|
|------------|-----------|
|<xref:System.Windows.Forms.IMessageFilter.PreFilterMessage%2A>|This method intercepts queued (also known as posted) Windows messages at the application level.|
|<xref:System.Windows.Forms.Control.PreProcessMessage%2A>|This method intercepts Windows messages at the form and control level before they have been processed.|
|<xref:System.Windows.Forms.Control.WndProc%2A>|This method processes Windows messages at the form and control level.|
|<xref:System.Windows.Forms.Control.DefWndProc%2A>|This method performs the default processing of Windows messages at the form and control level. This provides the minimal functionality of a window.|
|<xref:System.Windows.Forms.Control.OnNotifyMessage%2A>|This method intercepts messages at the form and control level, after they have been processed. The <xref:System.Windows.Forms.ControlStyles.EnableNotifyMessage> style bit must be set for this method to be called.|

 Keyboard and mouse messages are also processed by an additional set of overridable methods that are specific to those types of messages. For more information, see [How keyboard input works](how-keyboard-input-works.md) and [How mouse input works in Windows Forms](how-mouse-input-works-in-windows-forms.md).

## <a name="see-also"></a>See also

- [User input in Windows Forms](user-input-in-windows-forms.md)
- [Keyboard input in a Windows Forms application](keyboard-input-in-a-windows-forms-application.md)
- [Mouse input in a Windows Forms application](mouse-input-in-a-windows-forms-application.md)
| 118.102564 | 767 | 0.799826 | spa_Latn | 0.981438 |
3be8d9d21c2b98b9c0bd2e9a30c86bfc3ff2a3df | 1,401 | md | Markdown | windows-driver-docs-pr/print/installing-pages-for-a-printer-type.md | i35010u/windows-driver-docs.zh-cn | e97bfd9ab066a578d9178313f802653570e21e7d | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-02-04T01:49:58.000Z | 2021-02-04T01:49:58.000Z | windows-driver-docs-pr/print/installing-pages-for-a-printer-type.md | i35010u/windows-driver-docs.zh-cn | e97bfd9ab066a578d9178313f802653570e21e7d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | windows-driver-docs-pr/print/installing-pages-for-a-printer-type.md | i35010u/windows-driver-docs.zh-cn | e97bfd9ab066a578d9178313f802653570e21e7d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: 安装打印机类型的页面
description: 安装打印机类型的页面
keywords:
- 安装自定义的打印网页 WDK
- 自定义的打印网页 WDK,安装
- 特定于打印机的安装 WDK
ms.date: 04/20/2017
ms.localizationpriority: medium
ms.openlocfilehash: b256054456f6cfa83679672bd678912c6387b707
ms.sourcegitcommit: 418e6617e2a695c9cb4b37b5b60e264760858acd
ms.translationtype: MT
ms.contentlocale: zh-CN
ms.lasthandoff: 12/07/2020
ms.locfileid: "96796699"
---
# <a name="installing-pages-for-a-printer-type"></a>安装打印机类型的页面
如果打印机使用的是标准 TCP/IP 端口监视器,则可以安装特定于打印机类型的打印机详细信息页。 为此,请在打印机类型的 [打印机 INF 文件](printer-inf-files.md) 中包含该页的 ASP 文件以及所有从属文件 (如用于链接页的 .gif 文件或 ASP 文件) 。 下面是打印机 INF 文件的示例部分:
```cpp
[Manufacturer]
"ACME"
[ACME]
"ACME Mega Laser" = ACML01.PPD
[ACML01.PPD]
CopyFiles=@ACML01.PPD,PSCRIPT,ACML1WEB
DataSection=PSCRIPT_DATA
[ACML1WEB]
PAGE1.ASP, ACML1.ASP ;ACML1.ASP renamed to PAGE1.ASP during installation
ACML2.ASP
ACGF001.GIF
[DestinationDirs]
DefaultDestiDir=66000
ACML1WEB=66004
```
当打印机类安装程序遇到此 INF 文件部分时,它将执行以下操作:
- 创建设置为 " < 根 > \\ < 制造商 > \\ < 打印机类型 > " 的目录。 在此示例中,将创建以下子目录:
..\\ACME \\ Acme 万像素激光
- 将 Acml1、Asml2 和 Acgf001.gif 复制到子目录中。
- 将 Acml1 重命名为 ACML1WEB) 部分中的第一个语句导致的 Page1 (。
请注意,你必须通过在第一个 ASP 文件的前面加上 Page1 文件名称来确定要查看的第一个 ASP 文件,如示例中所示。 安装程序将此文件重命名为目标目录中的 Page1。
Microsoft 保留所有格式为 Page *n.*.ASP 的 asp 文件名,其中 *N* 为1、2、3等。
示例 INF 文件随 Windows 驱动程序工具包中的 [示例 ASP 文件](sample-asp-files.md) 一起提供。
| 20.304348 | 165 | 0.748037 | yue_Hant | 0.943467 |
3be9aa585a2eed32b85cc5a85d7e5f5e938ed0f9 | 3,068 | md | Markdown | analyze/README.md | lukasbm/HiL-Simulator | bef04b22ae2d30951015bfd7a1bf83ba0254dd56 | [
"Unlicense"
] | 1 | 2021-12-02T21:09:48.000Z | 2021-12-02T21:09:48.000Z | analyze/README.md | lukasbm/HiL-Simulator | bef04b22ae2d30951015bfd7a1bf83ba0254dd56 | [
"Unlicense"
] | null | null | null | analyze/README.md | lukasbm/HiL-Simulator | bef04b22ae2d30951015bfd7a1bf83ba0254dd56 | [
"Unlicense"
] | null | null | null | # Python scripts for analysis
This folder provides several python scripts to make analyzing and comparing the measured data (from the AD HiL system) and simulated data (from the OMNeT model) easier.
The core idea behind the scripts is to first convert them into a common datastructure.
This makes plotting and comparing them much easier.
## Reading in the Data
The two core scripts `read_measured.py` and `read_simulation.py` read in the provided CSV files and map them into the following data structure:

`result: Dict[String, Array[Float]]` (a mapping from signal name, e.g. `T0` or `dT1`, to its list of sample values)

for example:
```python
{
'B0': [32.0, 32.0, 32.0, 32.0, 32.0],
'B1': [0.0, 0.0, 0.0, 0.0, 0.0],
'B2': [32.0, 32.0, 32.0, 32.0, 32.0],
'B3': [47.0, 47.0, 47.0, 47.0, 47.0],
'T0': [0.0, 0.01, 0.02, 0.03, 0.04],
'T1': [0.0, 0.009003421, 0.019036095, 0.029040064, 0.039036946],
'T2': [0.0, 0.008916953, 0.018985091, 0.029028868, 0.039029702],
'T3': [0.0, 0.008860722, 0.018928369, 0.028972534, 0.038972418],
'dT0': [0.01, 0.01, 0.01, 0.01, 0.01],
'dT1': [0.009003421, 0.010032674, 0.010003969, 0.009996882, 0.010002851],
'dT2': [0.008916953, 0.010068138, 0.010043777, 0.010000834, 0.009924362],
'dT3': [0.008860722, 0.010067647, 0.010044165, 0.009999884, 0.009923654],
...
}
```
The two scripts export these data structures as `results`.

There is also the option of using slightly cleaned-up versions of the results.
These will, for example, have zero values removed where they don't make sense.
These are exported as `clean_results`.
## Using the Data
just import the data as follows:
```python
from read_simulation import results as sr
from read_measured import results as mr
```
and use them to analyze, plot or compare as you please.
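For example, a quick error summary between the two sources (a plain-Python sketch; `sr`/`mr` are stubbed here with tiny made-up dicts so the snippet is self-contained):

```python
def summarize(sim, meas, key):
    """Compare one signal present in both result dicts."""
    n = min(len(sim[key]), len(meas[key]))
    errs = [abs(s - m) for s, m in zip(sim[key][:n], meas[key][:n])]
    return {"mean_abs_err": sum(errs) / n, "max_abs_err": max(errs)}

sr = {"dT1": [0.0101, 0.0101, 0.0098]}   # simulated (stub)
mr = {"dT1": [0.0099, 0.0102, 0.0102]}   # measured (stub)
report = summarize(sr, mr, "dT1")
```

The same function works on the real `results` dicts imported from the two read scripts, one call per timestamp key.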
### Existing Scripts
I will now provide a short description of the existing Python scripts (besides the read scripts):
- `bar_cpu_usage_simulation.py`: plots the simulated cpu usage as a bar chart. One bar for each core. (limited to one pc)
- `gantt_scheduler.py`: generate a gantt chart of the simulated tasks (on any core) (limited to one pc)
- `mapping.py`: helper file for `read_simulated.py`
- `plot_cpu_usage_measured.py`: plots the measured CPU usage from the kscc.csv files
- `plot_cpu_usage_simulation.py`: plots the simulated CPU usage from OMNeT's exported output.json file (all cores, all PCs)
- `plot_multiple_cpu_usage_simulation_verification.py`: plots simulated cpu usage in different probing intervals, used for verification in my thesis
- `plot_timestamp_diff_all.py`: plots the diff timings (dT`x`) for simulated, measured and as a boxplot
- `plot_timestamp_subtraction_all.py`: plots the latency times (T`x` - T`x-1`) for simulated, measured and as a boxplot
- `generate_traces.py`: generates traces from measured CSV data.
## Useful stuff
- `awk '!seen[$0]++' kscc26398.csv > kscc26398_nodup.csv`: removes duplicate entries from the kscc CPU usage files.
- `simulations/<sim>/export.sh` removes long lines and exports the statistic vectors
## Bugs
- `generate_traces.py` sometimes generates negative values
| 47.2 | 168 | 0.72914 | eng_Latn | 0.953667 |
3beb47c43b92100984c870956716e3a2ad214d0d | 1,309 | md | Markdown | blog/online-activism.md | kaustavdm/website | 05acfd104234c1b70d0b7151db4a60a3f9a3a38a | [
"CC0-1.0"
] | 4 | 2015-12-15T08:21:02.000Z | 2020-03-06T10:44:58.000Z | blog/online-activism.md | kaustavdm/website | 05acfd104234c1b70d0b7151db4a60a3f9a3a38a | [
"CC0-1.0"
] | 4 | 2021-12-16T05:05:22.000Z | 2022-03-30T07:18:35.000Z | blog/online-activism.md | kaustavdm/website | 05acfd104234c1b70d0b7151db4a60a3f9a3a38a | [
"CC0-1.0"
] | 4 | 2015-12-26T15:56:09.000Z | 2019-01-09T10:44:59.000Z | ---
date: 2016-03-04
slug: online-activism
title: On online activism
tags:
- thoughts
- philosophy
---
If signing an online petition is _all_ that you _ever_ do about serious social, environmental, economic or political issues, then neither do you help change the current situation that you are complaining about, nor does it make any real difference.
It is even worse if you have signed those petitions because some of your friends have done the same.
Try to get to the root of the cause, try to understand it, and if you do understand the issue, raise your voice louder. Maybe join the larger protests?
If you do not want to be on the streets or be public about your views, why not stay in your comfort zone and try devising actual solutions to these issues? Like raising awareness, instead of just complaining, maybe by writing about it, or talking about it to your friends who do not understand the matter in question very well. Even better, if you are a technologist, maybe you can come up with technology-driven fixes for those issues that can be fixed with better technology.
Why stay inside a sound-proof glass house and try shouting occasionally at the world outside? No one will hear you. They will just see you make weird gestures.
Why not push your limits and try to make a difference?
| 62.333333 | 477 | 0.788388 | eng_Latn | 0.999913 |
3beb5df97fdedbdc16308a8b0a59d3c3f11711bb | 418 | md | Markdown | content/publication/2012-pasj-64-3-m/index.md | miurare/academic-kickstart | f442a15dd9421c82d7833c578508cb85019c9d5b | [
"MIT"
] | null | null | null | content/publication/2012-pasj-64-3-m/index.md | miurare/academic-kickstart | f442a15dd9421c82d7833c578508cb85019c9d5b | [
"MIT"
] | null | null | null | content/publication/2012-pasj-64-3-m/index.md | miurare/academic-kickstart | f442a15dd9421c82d7833c578508cb85019c9d5b | [
"MIT"
] | null | null | null | +++
title = "$^13$CO(J = 1-0) On-the-Fly Mapping of the Giant H II Region NGC 604: Variation in Molecular Gas Density and Temperature due to Sequential Star Formation"
date = 2012-02-01
authors = ["K. Muraoka", "T. Tosaki", "R. Miura", "S. Onodera", "N. Kuno", "K. Nakanishi", "H. Kaneko", "S. Komugi"]
publication_types = ["2"]
abstract = ""
selected = "false"
publication = "*pasj*"
doi = "10.1093/pasj/64.1.3"
+++
| 34.833333 | 163 | 0.650718 | eng_Latn | 0.479215 |
3bed1ae5bfec9c71044a04931a1b39349b21a641 | 553 | md | Markdown | _notes/bread.md | IrfanZM/digital-garden-jekyll-template | 7cbe53e5ba7d9b40dba7a75c19561e3088f6b3e2 | [
"MIT"
] | null | null | null | _notes/bread.md | IrfanZM/digital-garden-jekyll-template | 7cbe53e5ba7d9b40dba7a75c19561e3088f6b3e2 | [
"MIT"
] | null | null | null | _notes/bread.md | IrfanZM/digital-garden-jekyll-template | 7cbe53e5ba7d9b40dba7a75c19561e3088f6b3e2 | [
"MIT"
] | null | null | null | ---
title: Bread
---
The starting point for all my bread exploration.
1. [[The regular bread recipes]], tried and tested
2. Experimental bread recipes, in the works and still need tweaking/practice
3. Untried bread recipes, that I want to try but haven't yet
4. Bread theory, the science behind the bread
My goal at the moment is to have the following nailed down:
- Standard, quick white bread and wholemeal bread
- White sourdough
- Rye sourdough
- A dessert bread/cinnamon roll kind of thing
- 3 or 4 "fancy" breads
- Including a braided bread
| 27.65 | 76 | 0.755877 | eng_Latn | 0.999313 |
3bed216a1425cd7e71b68de0a264266095820850 | 5,833 | md | Markdown | src/posts/2021-08-05-foodshare0803.md | meshellg/ideal-giggle | 8ca4b82de51a175d15786e532afb8e922320672a | [
"MIT"
] | null | null | null | src/posts/2021-08-05-foodshare0803.md | meshellg/ideal-giggle | 8ca4b82de51a175d15786e532afb8e922320672a | [
"MIT"
] | null | null | null | src/posts/2021-08-05-foodshare0803.md | meshellg/ideal-giggle | 8ca4b82de51a175d15786e532afb8e922320672a | [
"MIT"
] | null | null | null | ---
title: "FoodShare DWS FoodBox recipes 08/03"
date: "2021-08-05"
tags:
- "recipes"
thumb: "2021-08-03-vegetable-box-500x250.jpg"
---
Here I am again, excitedly planning out some meals from our recently arrived [Dismantling White Supremacy box](https://goodfoodbox.foodshare.net/collections/organic/products/large-food-justice-box). It helps me a lot to have a few ideas of what to do with our veggies, and last week's missed box totally reduced our green vegetable consumption significantly, so I'm very excited about our box today. I like that all the produce is grown nearby, and that it often inspires me to try new recipes I might not have otherwise tried.
![](images/2021-08-03-vegetable-box-500x250.jpg)
## Here's what we got this week\*:
\*some of these could probably be identified a bit more precisely, but I'm not very good at my tomato variety identification!
- heirloom tomatoes
- cherry tomatoes
- shishito peppers
- chard
- perilla leaves
- pole beans
- Tokyo bekana
- field cucumber
- zucchini
- spring onions
![](images/perilla.jpg)
_this is a close up image of a stack of perilla_
It's a similar box to what was on offer last week BUT I didn't actually get any of these things because I ordered the box incorrectly. I'm glad to get a chance to try **perilla** leaves, because I was very excited to try these [Korean Mushroom lettuce wraps](https://christieathome.com/blog/korean-mushroom-lettuce-wraps/) from Christie at Home, I've ordered some [ssamjang](https://haisue.ca/product/daesang-seasoned-soybean-paste-mild-ssamjang-500g/) from an online store called Haisue, based in the GTA, but you can get it at pretty much any Asian grocery store, or [make your own](http://veganseoulfood.weebly.com/recipe/korean-dipping-sauce-ssamjang-recipe)!
![](images/tokyo-bekhana.jpg)
_this is a close up image of Tokyo bekana_
I've had my eye on these [Tokyo Bekana Spring Rolls](https://www.wozupi.com/blog/recipe/tokyo-bekana-spring-rolls) from Wozupi Tribal Gardens as well. Like I mentioned [last week](https://meshell.ca/blog/another-week-of-foodshares-dws-box/), Tokyo **bekana** is similar to napa cabbage, bok choy, or pak choy, and can be used in a variety of ways, in stir fries and as a tender salad green. I'm tempted to try it in the salad rolls mentioned above, wrapped with the bekana, but this [salad with tahini and currants](https://granorfarm.com/recipes/bekana-with-sesame-cashews-and-currants) from Granor farm in Michigan looks like a great option as well.
One of the things I have been hoping to make again with some fresh **green onions** are [Green Onion Cakes,](https://www.thestar.com/edmonton/2018/04/10/siu-to-78-is-believed-to-be-the-man-who-popularized-green-onion-cakes-in-edmonton.html) following the simple but perfect recipe from Siu To in Edmonton, who has since started a [restaurant devoted to green onion cakes](https://www.greenonioncakeman.com/). But there is also this Chinese inspired [green onion spaghetti](http://www.marystestkitchen.com/chinese-inspired-green-onion-spaghetti/) recipe from the always wonderful Mary's Test Kitchen that calls to me. Part of me wants to make it with a mix of regular pasta and zucchini noodles, but then for the **zucchini** there is this magically simple [zucchini noodle dish](https://thekoreanvegan.com/creamy-zucchini-spaghetti/) with roasted zucchini and a creamy bean sauce from the Korean Vegan.
![](images/shishito.jpg)
_the usual shishito preparation_
I have ways I like of preparing vegetables, and without some extra effort or research to find alternatives, I'll just keep doing things the way I've always done them. With **shishito peppers**, I'm not sure what I'd do that wasn't simply frying them in a bit of oil and salt until they're blistered, but this addition of a [spicy miso sauce](https://veganmiam.com/recipes/spicy-miso-shishito-peppers) from Vegan Miam is just the extra angle I was looking for.
Putting the **chard** greens in the Efo Riro was a successful and delicious addition, but someone mentioned a ribollita to me and I have become fixated on the idea of making some at home, and this [Cozy and Comforting Ribollita](https://www.yumsome.com/cosy-and-comforting-ribollita-tuscan-bread-soup/) from yumsome is a hearty soup that looks perfect for some of next week's rainy days.
I can see the future, and the **heirloom tomatoes** are going to be chopped up into a salad with the **cucumber**, some onion, and some herbs. Maybe with a tofu feta, like [this recipe from](https://cadryskitchen.com/tomato-cucumber-salad/) Cadry's Kitchen, or maybe a [classic panzanella salad](https://www.seriouseats.com/classic-panzanella-salad-recipe) to go along with my ribollita. After the [panzanella from Gia](https://meshell.ca/blog/gia-toronto-first-tastes/) last month, I've been looking forward to having it again soon. I was lucky enough this year to have a pair of ample basil plants, so my Mediterranean dishes have really been shining this summer.
There seems to be a pasta-centric theme going with my menu planning this week, but finally, I'm aiming to do a [Simple Vegan Burst Cherry Tomato pasta](https://www.thefullhelping.com/simple-vegan-burst-cherry-tomato-pasta/) from The Full Helping. That is, if we manage not to snack on all the tomatoes between now and then. I also love the look of this [Pesto Risotto with roasted chickpeas and tomatoes.](https://crumbsandcaramel.com/vegan-pesto-risotto-roasted-tomatoes-chickpeas/) Roasted cherry tomatoes are some of the most delicious little bites I've had the chance to eat.
Hopefully you find some recipes that catch your eye in this collection. I will post a follow-up on some of the recipes we ended up trying, and what we thought, either on the blog next Thursday or on Instagram as it happens.
Best wishes as always. Ciao!
| 102.333333 | 902 | 0.778502 | eng_Latn | 0.99468 |
===
The `org.librarysimplified.migration.from3master` module provides a
data migration service to migrate data from the old 3.x version of the
Library Simplified application.
#### See Also
* [org.librarysimplified.migration.api](../simplified-migration-api/README.md)
* [org.librarysimplified.migration.spi](../simplified-migration-spi/README.md)
| 32.833333 | 78 | 0.796954 | eng_Latn | 0.713839 |
# php
LEMP stack/LAMP stack/LNMP stack installation scripts for CentOS/Redhat Debian and Ubuntu <https://blog.linuxeye.cn/31.html>
- 编译安装
- 软件源代码包存放位置:/usr/local/src
- 源码包编译安装位置:.configure 参数
## Nginx
- Ubuntu
- Server Configuration:/etc/nginx
- /etc/nginx: The Nginx configuration directory. All of the Nginx configuration files reside here.
- /etc/nginx/nginx.conf: The main Nginx configuration file. This can be modified to make changes to the Nginx global configuration.
- /etc/nginx/sites-available/: The directory where per-site "server blocks" can be stored. Nginx will not use the configuration files found in this directory unless they are linked to the sites-enabled directory (see below). Typically, all server block configuration is done in this directory, and then enabled by linking to the other directory.
- /etc/nginx/sites-enabled/: The directory where enabled per-site "server blocks" are stored. Typically, these are created by linking to configuration files found in the sites-available directory.
- /etc/nginx/snippets: This directory contains configuration fragments that can be included elsewhere in the Nginx configuration. Potentially repeatable configuration segments are good candidates for refactoring into snippets.
- Content:/var/www/html
- Server Logs
- /var/log/nginx/access.log: Every request to your web server is recorded in this log file unless Nginx is configured to do otherwise.
- /var/log/nginx/error.log: Any Nginx errors will be recorded in this log.
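Since most debugging starts in access.log, here is a small sketch of pulling the HTTP status code out of a combined-format log line with awk. The log line itself is a fabricated sample:

```shell
# A sample line in the default combined log format (made up for illustration)
line='127.0.0.1 - - [10/Oct/2000:13:55:36 -0700] "GET /index.php HTTP/1.0" 200 2326'
# Split on the closing quote of the request string; the status code is
# the first word of what follows it.
status=$(printf '%s\n' "$line" | awk -F'" ' '{split($2, a, " "); print a[1]}')
echo "$status"    # prints: 200
```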
```sh
wget http://nginx.org/keys/nginx_signing.key
sudo apt-key add nginx_signing.key
# or
echo "deb http://nginx.org/packages/mainline/ubuntu/ trusty nginx" >> /etc/apt/sources.list # for Nginx 1.9+, add /mainline after packages to track the mainline branch
echo "deb-src http://nginx.org/packages/mainline/ubuntu/ trusty nginx" >> /etc/apt/sources.list
apt-get install python-software-properties
add-apt-repository ppa:nginx/stable
apt-get update
apt-get install nginx
sudo systemctl stop|start|restart|reload|disable|enable nginx.service
# Build from source
groupadd www
useradd -g www www -s /bin/false
cd /usr/local/src
tar zxvf nginx-1.6.0.tar.gz
cd nginx-1.6.0
./configure --prefix=/usr/local/nginx --without-http_memcached_module --user=www --group=www --with-http_stub_status_module --with-http_ssl_module --with-http_gzip_static_module --with-openssl=/usr/local/src/openssl-1.0.1h --with-zlib=/usr/local/src/zlib-1.2.8 --with-pcre=/usr/local/src/pcre-8.35
# Note: --with-openssl=/usr/local/src/openssl-1.0.1h --with-zlib=/usr/local/src/zlib-1.2.8 --with-pcre=/usr/local/src/pcre-8.35 point to the extracted source directories, not the install prefixes; using install paths makes the build fail
make && make install
# Service management script: ../../Ops/nginx.service
chmod 775 /etc/rc.d/init.d/nginx # make the init script executable
## /etc/init.d/nginx
#!/bin/bash
# chkconfig: - 30 21
# description: http service.
# Source Function Library
. /etc/init.d/functions
# Nginx Settings
NGINX_SBIN="/usr/local/nginx/sbin/nginx"
NGINX_CONF="/usr/local/nginx/conf/nginx.conf"
NGINX_PID="/usr/local/nginx/logs/nginx.pid"
RETVAL=0
prog="Nginx"
start() {
echo -n $"Starting $prog: "
mkdir -p /dev/shm/nginx_temp
daemon $NGINX_SBIN -c $NGINX_CONF
RETVAL=$?
echo
return $RETVAL
}
stop() {
echo -n $"Stopping $prog: "
killproc -p $NGINX_PID $NGINX_SBIN -TERM
rm -rf /dev/shm/nginx_temp
RETVAL=$?
echo
return $RETVAL
}
reload(){
echo -n $"Reloading $prog: "
killproc -p $NGINX_PID $NGINX_SBIN -HUP
RETVAL=$?
echo
return $RETVAL
}
restart(){
stop
start
}
configtest(){
$NGINX_SBIN -c $NGINX_CONF -t
return 0
}
case "$1" in
start)
start
;;
stop)
stop
;;
reload)
reload
;;
restart)
restart
;;
configtest)
configtest
;;
*)
echo $"Usage: $0 {start|stop|reload|restart|configtest}"
RETVAL=1
esac
exit $RETVAL
chkconfig nginx on # start at boot
sudo /sbin/chkconfig --list nginx
/etc/rc.d/init.d/nginx restart # restart
/usr/local/nginx/sbin/nginx
# /usr/local/nginx/conf/nginx.conf
user nobody nobody;
worker_processes 2;
error_log /usr/local/nginx/logs/nginx_error.log crit;
pid /usr/local/nginx/logs/nginx.pid;
worker_rlimit_nofile 51200;
events
{
use epoll;
worker_connections 6000;
}
http
{
include mime.types;
default_type application/octet-stream;
server_names_hash_bucket_size 3526;
server_names_hash_max_size 4096;
log_format combined_realip '$remote_addr $http_x_forwarded_for [$time_local]'
'$host "$request_uri" $status'
'"$http_referer" "$http_user_agent"';
sendfile on;
tcp_nopush on;
keepalive_timeout 30;
client_header_timeout 3m;
client_body_timeout 3m;
send_timeout 3m;
connection_pool_size 256;
client_header_buffer_size 1k;
large_client_header_buffers 8 4k;
request_pool_size 4k;
output_buffers 4 32k;
postpone_output 1460;
client_max_body_size 10m;
client_body_buffer_size 256k;
client_body_temp_path /usr/local/nginx/client_body_temp;
proxy_temp_path /usr/local/nginx/proxy_temp;
fastcgi_temp_path /usr/local/nginx/fastcgi_temp;
fastcgi_intercept_errors on;
tcp_nodelay on;
gzip on;
gzip_min_length 1k;
gzip_buffers 4 8k;
gzip_comp_level 5;
gzip_http_version 1.1;
gzip_types text/plain application/x-javascript text/css text/htm application/xml;
server
{
listen 80;
server_name localhost;
index index.html index.htm index.php;
root /usr/local/nginx/html;
location ~ \.php$ {
include fastcgi_params;
fastcgi_pass unix:/tmp/php-fcgi.sock;
fastcgi_index index.php;
fastcgi_param SCRIPT_FILENAME /usr/local/nginx/html$fastcgi_script_name;
}
}
}
# /usr/local/nginx/conf/nginx.conf
user www www; # uncomment the first user line and set the run user/group to www www; it must match user and group in /usr/local/php/etc/php-fpm.conf, or PHP requests will fail
index index.html index.htm index.php; # add index.php
# /usr/local/nginx/conf/servers/
# pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
location ~ \.php$ {
root html;
fastcgi_pass 127.0.0.1:9000;
fastcgi_index index.php;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
include fastcgi_params;
}
# Uncomment the FastCGI server location block above; in the fastcgi_param line use $document_root$fastcgi_script_name, or an absolute path
/etc/init.d/nginx restart # restart nginx
service php-fpm start # start php-fpm
```
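A quick way to sanity-check which virtual hosts a config defines is to pull the server_name directives out of it. A minimal awk sketch, run here against an inline sample config rather than the real file:

```shell
# Inline sample config (not the real file), just to show the extraction
conf='server {
    listen 80;
    server_name example.local;
}'
# Match lines whose first field is server_name, strip the trailing semicolon
name=$(printf '%s\n' "$conf" | awk '$1 == "server_name" { gsub(";", "", $2); print $2 }')
echo "$name"    # prints: example.local
```

Against the real install, the same awk line can be fed `/usr/local/nginx/conf/nginx.conf` directly.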
## MySQL
```sh
sudo apt-get install mysql-server mysql-client # or: mariadb-server mariadb-client
sudo mysql_secure_installation
# There are three levels of password validation policy:
# LOW Length >= 8
# MEDIUM Length >= 8, numeric, mixed case, and special characters
# STRONG Length >= 8, numeric, mixed case, special characters and dictionary
sudo service mysql status
sudo mysqladmin -p -u root version
# Build from source
wget http://www.lishiming.net/data/at ... -icc-glibc23.tar.gz
tar zxvf mysql-5.1.40-linux-i686-icc-glibc23.tar.gz
mv mysql-5.1.40-linux-i686-icc-glibc23.tar.gz /usr/local/mysql
groupadd mysql # add the mysql group
useradd -g mysql mysql -s /bin/false # create the mysql user in the mysql group, with no login shell
useradd -s /sbin/nologin mysql # -s /sbin/nologin means the mysql account cannot log in to Linux
mkdir -p /data/mysql # create the MySQL data directory
chown -R mysql:mysql /data/mysql # set ownership on the data directory
mkdir -p /usr/local/mysql # create the MySQL install directory
cmake . -DCMAKE_INSTALL_PREFIX=/usr/local/mysql -DMYSQL_DATADIR=/data/mysql -DSYSCONFDIR=/etc # configure
make && make install
rm -rf /etc/my.cnf # remove the system default config file (skip if it does not exist)
cd /usr/local/mysql # enter the MySQL install directory
./scripts/mysql_install_db --user=mysql --basedir=/usr/local/mysql --datadir=/data/mysql # --user sets the owner, datadir the database path; two "OK"s here mean it worked
ln -s /usr/local/mysql/my.cnf /etc/my.cnf # symlink the config into /etc
cp ./support-files/mysql.server /etc/rc.d/init.d/mysqld # add MySQL to system startup
chmod 755 /etc/init.d/mysqld # make it executable
# In /etc/init.d/mysqld (/etc/rc.d/init.d/mysqld) find "datadir=" and change it:
datadir=/data/mysql
basedir=/usr/local/mysql # MySQL install path
datadir=/data/mysql # MySQL data directory
chkconfig --add mysqld
chkconfig mysqld on # start at boot
# /etc/profile
export PATH=$PATH:/usr/local/mysql/bin
source /etc/profile
service mysqld start # start the service
# These two lines link the MySQL libraries into the default locations, so software such as PHP can be compiled without specifying MySQL library paths
ln -s /usr/local/mysql/lib/mysql /usr/lib/mysql
ln -s /usr/local/mysql/include/mysql /usr/include/mysql
mkdir /var/lib/mysql # create the directory
ln -s /tmp/mysql.sock /var/lib/mysql/mysql.sock # symlink the socket
mysql_secure_installation # set the MySQL root password; answer Y and enter the password twice when prompted
```
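The mysql_install_db invocation above is easy to get wrong when paths change; building the command from variables keeps basedir and datadir consistent with my.cnf. A small sketch using the same paths as the example above:

```shell
BASEDIR=/usr/local/mysql   # MySQL install path, as configured above
DATADIR=/data/mysql        # data directory; must match my.cnf and the init script
install_cmd="./scripts/mysql_install_db --user=mysql --basedir=$BASEDIR --datadir=$DATADIR"
echo "$install_cmd"
```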
## PHP
- Route every PHP request to a single front controller, then implement routing inside that file by parsing `REQUEST_URI`
- Modules
- curl
- GD
- pear
- mcrypt
- mbstring:php7.1-mbstring
- intl
- dom:php7.1-dom
- Enable mcrypt: `sudo phpenmod mcrypt`
```sh
sudo apt-get install python-software-properties software-properties-common
sudo add-apt-repository ppa:ondrej/php
sudo apt-get update
sudo apt-cache search php7.1*
sudo apt-get install php7.2 php7.2-fpm php7.2-mysql php7.2-common php7.2-curl php7.2-cli php7.2-mbstring php7.2-xml php7.2-bcmath php7.2-mcrypt php7.2-json php7.2-cgi php7.2-gd php-pear php7.2-intl php7.2-soap php7.2-xdebug php7.2-xsl php7.2-zip php7.2-xmlrpc php7.2-imagick php7.2-dev php7.2-imap php7.2-opcache -y
## Build from source
wget https://www.php.net/distributions/php-7.3.5.tar.gz
cd /usr/local/src
tar -zvxf php-7.3.5.tar.gz
cd php-7.3.5
sudo apt install gcc make openssl curl libbz2-dev libxml2-dev libjpeg-dev libpng-dev libfreetype6-dev pkg-config libzip-dev bison autoconf build-essential pkg-config git-core libltdl-dev libbz2-dev libxml2-dev libxslt1-dev libssl-dev libicu-dev libpspell-dev libenchant-dev libmcrypt-dev libpng-dev libjpeg8-dev libfreetype6-dev libmysqlclient-dev libreadline-dev libcurl4-openssl-dev librecode-dev libsqlite3-dev libonig-dev libicu-dev
# No package ‘oniguruma’ found
git clone https://github.com/kkos/oniguruma.git oniguruma
cd oniguruma
./autogen.sh
./configure
make
make install
./configure --prefix=/usr/local/php7 --enable-fpm --with-fpm-user=www-data --with-fpm-group=www-data --with-pdo-mysql --with-zlib --enable-xml --enable-bcmath --enable-shmop --enable-sysvsem --enable-inline-optimization --with-curl --enable-mbstring --with-openssl --enable-pcntl --enable-sockets --with-xmlrpc --enable-soap --without-pear --with-gettext --enable-intl --enable-maintainer-zts --enable-exif --enable-calendar --enable-opcache --enable-session --with-iconv --with-pdo-mysql=mysqlnd
export LD_LIBRARY_PATH=/usr/local/libgd/lib
make && make install
cp php.ini-production /usr/local/php/etc/php.ini # copy the PHP config file
rm -rf /etc/php.ini # remove the distro default config
ln -s /usr/local/php/etc/php.ini /etc/php.ini # symlink into /etc
cp /usr/local/php/etc/php-fpm.d/www.conf.default /usr/local/php/etc/php-fpm.d/www.conf
cp -R ./sapi/fpm/php-fpm /etc/init.d/php-fpm || cp ./sapi/fpm/init.d.php-fpm /etc/init.d/php-fpm # init script
cp /usr/local/php/etc/php-fpm.conf.default /usr/local/php/etc/php-fpm.conf # copy the template as the php-fpm config
ln -s /usr/local/php/etc/php-fpm.conf /etc/php-fpm.conf # symlink into /etc
# /usr/local/php/etc/php-fpm.conf or /usr/local/php/etc/php-fpm.d/www.conf
user = www # run php-fpm as user www
group = www # run php-fpm as group www
pid = run/php-fpm.pid # uncomment this line (remove the leading semicolon)
# Start php-fpm at boot
chmod +x /etc/rc.d/init.d/php-fpm # make the script executable
chkconfig php-fpm on # enable at boot
## /usr/local/php/php.ini
# If the requested file does not exist, stop Nginx from passing the request to the PHP-FPM backend, to avoid malicious script injection
disable_functions = passthru,exec,system,chroot,scandir,chgrp,chown,shell_exec,proc_open,proc_get_status,ini_alter,ini_alter,ini_restore,dl,openlog,syslog,readlink,symlink,popepassthru,stream_socket_server,escapeshellcmd,dll,popen,disk_free_space,checkdnsrr,checkdnsrr,getservbyname,getservbyport,disk_total_space,posix_ctermid,posix_get_last_error,posix_getcwd, posix_getegid,posix_geteuid,posix_getgid, posix_getgrgid,posix_getgrnam,posix_getgroups,posix_getlogin,posix_getpgid,posix_getpgrp,posix_getpid, posix_getppid,posix_getpwnam,posix_getpwuid, posix_getrlimit, posix_getsid,posix_getuid,posix_isatty, posix_kill,posix_mkfifo,posix_setegid,posix_seteuid,posix_setgid, posix_setpgid,posix_setsid,posix_setuid,posix_strerror,posix_times,posix_ttyname,posix_uname
date.timezone = PRC # set the timezone
expose_php = Off # hide the PHP version
short_open_tag = On # allow PHP short tags
opcache.enable=1 # enable the opcode cache
opcache.enable_cli=0
zend_extension=opcache.so # load the opcache extension
cgi.fix_pathinfo=0
# /usr/local/php/etc/php-fpm/conf
[global]
pid = /usr/local/php/var/run/php-fpm.pid
error_log = /usr/local/php/var/log/php-fpm.log
[www]
listen = /tmp/php-fcgi.sock
user = php-fpm
group = php-fpm
pm = dynamic
pm.max_children = 50
pm.start_servers = 20
pm.min_spare_servers = 5
pm.max_spare_servers = 35
pm.max_requests = 500
rlimit_files = 1024
useradd -s /sbin/nologin php-fpm
cp /usr/local/src/php-5.3.27/sapi/fpm/init.d.php-fpm /etc/init.d/php-fpm
chmod 755 /etc/init.d/php-fpm
/usr/local/bin/php-fpm
chkconfig php-fpm on
sudo service php7.1-fpm restart
```
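The disable_functions hardening above is just a comma-separated blacklist: PHP refuses a call if the function name appears in the list. That membership test can be mimicked in shell; the list here is a shortened stand-in for the full php.ini value:

```shell
# Shortened stand-in for the php.ini disable_functions value
disable_functions="passthru,exec,system,shell_exec"

is_disabled() {
    # Wrap both sides in commas so only whole function names match
    case ",$disable_functions," in
        *",$1,"*) echo blocked ;;
        *)        echo allowed ;;
    esac
}

is_disabled exec       # prints: blocked
is_disabled phpinfo    # prints: allowed
```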
### Configure Nginx to Use the PHP Processor
- Syntax check: `sudo nginx -t`
- Configure the server block
- Reload the service: `sudo nginx -s reload`
- Add the domain name
```sh
# /etc/php/7.0/fpm/pool.d
# listen = [::]:9000
#fastcgi_pass 127.0.0.1:9000;
listen = unix:/run/php/php7.2-fpm.sock
# default
server {
listen 80 default_server;
listen [::]:80 default_server;
root /var/www/html;
index index.php index.html index.htm index.nginx-debian.html;
    server_name _;
location / {
try_files $uri $uri/ =404;
}
location ~* \.php$ {
fastcgi_index index.php;
# fastcgi_pass 127.0.0.1:9000;
fastcgi_pass unix:/var/run/php-fpm.sock;
include fastcgi_params;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
fastcgi_param SCRIPT_NAME $fastcgi_script_name;
}
}
# 自定义server
server {
listen 80;
root /home/vagrant/www/example.local;
index index.php index.html index.htm;
server_name example.local;
location / {
try_files $uri $uri/ /index.php?$args;
}
location ~ \.php$ {
try_files $uri =404;
fastcgi_split_path_info ^(.+\.php)(/.+)$;
fastcgi_pass unix:/var/run/php/php5.6-fpm.sock;
fastcgi_index index.php;
include fastcgi_params;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
}
}
## Front/back-end separation
server {
listen 8082;
server_name localhost;
root /Users/henry/Workspace/ShareFolder/Front/dist;
    index index.html index.htm;
charset utf-8;
access_log /usr/local/var/log/nginx/front-test.access.log;
error_log /usr/local/var/log/nginx/front-test.error.log;
location / {
try_files $uri $uri/ /index.html;
}
location ~ /api/ {
proxy_pass http://tp5.app.local:8080;
}
}
sudo ln -s /etc/nginx/sites-available/example.com /etc/nginx/sites-enabled/
```
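The front/back-end split above routes by URI prefix: anything under /api/ is proxied to the backend, everything else falls through to index.html for the single-page app. That decision logic, mimicked in shell (the backend URL is the one from the example config):

```shell
route() {
    # Mirror of the nginx location matching in the config above
    case "$1" in
        /api/*) echo "proxy -> http://tp5.app.local:8080" ;;
        *)      echo "try_files -> /index.html" ;;
    esac
}

route /api/users    # prints: proxy -> http://tp5.app.local:8080
route /dashboard    # prints: try_files -> /index.html
```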
## Prerequisites
```sh
# base packages
yum install -y apr* autoconf automake bison cloog-ppl compat* cpp curl curl-devel fontconfig fontconfig-devel freetype freetype* freetype-devel gcc gcc-c++ gtk+-devel gd gettext gettext-devel glibc kernel kernel-headers keyutils keyutils-libs-devel krb5-devel libcom_err-devel libpng* libjpeg* libsepol-devel libselinux-devel libstdc++-devel libtool* libgomp libxml2 libxml2-devel libXpm* libtiff libtiff* make mpfr ncurses* ntp openssl openssl-devel patch pcre-devel perl php-common php-gd policycoreutils telnet nasm nasm* wget zlib-devel
# install cmake
cd /usr/local/src
tar zxvf cmake-2.8.11.2.tar.gz
cd cmake-2.8.11.2
./configure
make && make install
# Nginx prerequisites
# install pcre
cd /usr/local/src
mkdir /usr/local/pcre
tar zxvf pcre-8.35.tar.gz
cd pcre-8.35
./configure --prefix=/usr/local/pcre
make && make install
# install openssl
cd /usr/local/src
mkdir /usr/local/openssl
tar zxvf openssl-1.0.1h.tar.gz
cd openssl-1.0.1h
./config --prefix=/usr/local/openssl
make && make install
vi /etc/profile # add openssl to the system PATH: append the following line at the end
export PATH=$PATH:/usr/local/openssl/bin
source /etc/profile # apply immediately
# install zlib
cd /usr/local/src
mkdir /usr/local/zlib
tar zxvf zlib-1.2.8.tar.gz
cd zlib-1.2.8
./configure --prefix=/usr/local/zlib
make && make install
# PHP build prerequisites
# install yasm
cd /usr/local/src
tar zxvf yasm-1.2.0.tar.gz
cd yasm-1.2.0
./configure
make && make install
# install libmcrypt
cd /usr/local/src
tar zxvf libmcrypt-2.5.8.tar.gz
cd libmcrypt-2.5.8
./configure
make && make install
# install libvpx
cd /usr/local/src
tar xvf libvpx-v1.3.0.tar.bz2
cd libvpx-v1.3.0
./configure --prefix=/usr/local/libvpx --enable-shared --enable-vp9
make && make install
# install tiff
cd /usr/local/src
tar zxvf tiff-4.0.3.tar.gz
cd tiff-4.0.3
./configure --prefix=/usr/local/tiff --enable-shared
make && make install
# install libpng
cd /usr/local/src
tar zxvf libpng-1.6.12.tar.gz
cd libpng-1.6.12
./configure --prefix=/usr/local/libpng --enable-shared
make && make install
# install freetype
cd /usr/local/src
tar zxvf freetype-2.5.3.tar.gz
cd freetype-2.5.3
./configure --prefix=/usr/local/freetype --enable-shared
make && make install
# install jpeg
cd /usr/local/src
tar zxvf jpegsrc.v9a.tar.gz
cd jpeg-9a
./configure --prefix=/usr/local/jpeg --enable-shared
make && make install
# install libgd
cd /usr/local/src
tar zxvf libgd-2.1.0.tar.gz # extract
cd libgd-2.1.0 # enter the directory
./configure --prefix=/usr/local/libgd --enable-shared --with-jpeg=/usr/local/jpeg --with-png=/usr/local/libpng --with-freetype=/usr/local/freetype --with-fontconfig=/usr/local/freetype --with-xpm=/usr/ --with-tiff=/usr/local/tiff --with-vpx=/usr/local/libvpx # configure
make && make install
# install t1lib
cd /usr/local/src
tar zxvf t1lib-5.1.2.tar.gz
cd t1lib-5.1.2
./configure --prefix=/usr/local/t1lib --enable-shared
make without_doc
make install
# install PHP
# Note: on a 64-bit system run the following two commands first, otherwise the PHP build fails (not needed on 32-bit)
ln -s /usr/lib64/libltdl.so /usr/lib/libltdl.so
cp -frp /usr/lib64/libXpm.so* /usr/lib/
```
## Mac environment setup
```sh
# macOS ships Apache and PHP 5 by default
httpd -v
php -v
# stop the httpd service
sudo apachectl stop
# uninstall the bundled Apache and PHP 5.6
sudo rm /usr/sbin/apachectl
sudo rm /usr/sbin/httpd
sudo rm -r /etc/apache2/ sudo rm -r /usr/bin/php
# start at login
$ cp /usr/local/opt/php71/homebrew.mxcl.php71.plist ~/Library/LaunchAgents/
$ launchctl load -w ~/Library/LaunchAgents/homebrew.mxcl.php71.plist
# configure nginx
$ cp /usr/local/Cellar/nginx/1.10.2_1/homebrew.mxcl.nginx.plist ~/Library/LaunchAgents/
$ launchctl load -w ~/Library/LaunchAgents/homebrew.mxcl.nginx.plist
sudo chown root:wheel /usr/local/Cellar/nginx/1.10.2_1/bin/nginx
sudo chmod u+s /usr/local/Cellar/nginx/1.10.2_1/bin/nginx
sudo nginx -s reload/reopen/stop/quit
/usr/local/etc/nginx/nginx.conf
sudo brew services start nginx
curl -IL http://127.0.0.1:8080
sudo brew services stop nginx
mkdir -p /usr/local/etc/nginx/sites-available && \
mkdir -p /usr/local/etc/nginx/sites-enabled && \
mkdir -p /usr/local/etc/nginx/conf.d && \
mkdir -p /usr/local/etc/nginx/ssl
# mysql
brew install mysql
cd /usr/local/opt/mysql/
sudo vim my.cnf
./bin/mysql_install_db # initialize
mysql_install_db --verbose --user=`whoami` --basedir="$(brew --prefix mysql)" --datadir=/usr/local/var/mysql
/usr/local/bin/mysqladmin -u root password 'new-password' # set the password
/usr/local/bin/mysql_secure_installation # security hardening script
mysql -u root -p
# PHP setup
brew install php71 --with-imap --with-tidy --with-debug --with-pgsql --with-mysql --with-fpm
/usr/local/etc/php/7.1/php-fpm.conf
;pid = run/php-fpm.pid
;error_log = log/php-fpm.log
# change to
pid = /usr/local/var/run/php-fpm.pid
error_log = /usr/local/var/log/php-fpm.log
# php config
/usr/local/etc/php/7.1/php.ini # error-reporting level settings
/usr/local/etc/php/7.1/php-fpm.d/www.conf
brew services start php71
lsof -Pni4 | grep LISTEN | grep php
alias nginx.start='launchctl load -w ~/Library/LaunchAgents/homebrew.mxcl.nginx.plist'
alias nginx.stop='launchctl unload -w ~/Library/LaunchAgents/homebrew.mxcl.nginx.plist'
alias nginx.restart='nginx.stop && nginx.start'
alias php-fpm.start="launchctl load -w ~/Library/LaunchAgents/homebrew.mxcl.php54.plist"
alias php-fpm.stop="launchctl unload -w ~/Library/LaunchAgents/homebrew.mxcl.php54.plist"
alias php-fpm.restart='php-fpm.stop && php-fpm.start'
alias mysql.start="launchctl load -w ~/Library/LaunchAgents/homebrew.mxcl.mariadb.plist"
alias mysql.stop="launchctl unload -w ~/Library/LaunchAgents/homebrew.mxcl.mariadb.plist"
alias mysql.restart='mysql.stop && mysql.start'
openssl req -new -newkey rsa:4096 -days 365 -nodes -x509 -subj "/C=US/ST=State/L=Town/O=Office/CN=localhost" -keyout /usr/local/etc/nginx/ssl/localhost.key -out /usr/local/etc/nginx/ssl/localhost.crt
openssl req -new -newkey rsa:4096 -days 365 -nodes -x509 -subj "/C=US/ST=State/L=Town/O=Office/CN=phpmyadmin" -keyout /usr/local/etc/nginx/ssl/phpmyadmin.key -out /usr/local/etc/nginx/ssl/phpmyadmin.crt
```
### Downloads

- nginx: <http://nginx.org/download/nginx-1.6.0.tar.gz>
- MySQL: <http://cdn.mysql.com/Downloads/MySQL-5.6/mysql-5.6.19.tar.gz>
- PHP: <http://cn2.php.net/distributions/php-5.5.14.tar.gz>
- pcre (nginx rewrite support): <ftp://ftp.csx.cam.ac.uk/pub/software/programming/pcre/pcre-8.35.tar.gz>
- openssl (nginx dependency): <http://www.openssl.org/source/openssl-1.0.1h.tar.gz>
- zlib (nginx dependency): <http://zlib.net/zlib-1.2.8.tar.gz>
- cmake (MySQL build tool): <http://www.cmake.org/files/v2.8/cmake-2.8.11.2.tar.gz>
- libmcrypt (PHP extension): <http://nchc.dl.sourceforge.net/project/mcrypt/Libmcrypt/2.5.8/libmcrypt-2.5.8.tar.gz>
- yasm (PHP extension): <http://www.tortall.net/projects/yasm/releases/yasm-1.2.0.tar.gz>
- t1lib (PHP extension): <ftp://sunsite.unc.edu/pub/Linux/libs/graphics/t1lib-5.1.2.tar.gz>
- gd library: <https://bitbucket.org/libgd/gd-libgd/downloads/libgd-2.1.0.tar.gz>
- libvpx (needed by gd): <https://webm.googlecode.com/files/libvpx-v1.3.0.tar.bz2>
- tiff (needed by gd): <http://download.osgeo.org/libtiff/tiff-4.0.3.tar.gz>
- libpng (needed by gd): <ftp://ftp.simplesystems.org/pub/png/src/libpng16/libpng-1.6.12.tar.gz>
- freetype (needed by gd): <http://ring.u-toyama.ac.jp/archives/graphics/freetype/freetype2/freetype-2.5.3.tar.gz>
- jpegsrc (needed by gd): <http://www.ijg.org/files/jpegsrc.v9a.tar.gz>
## References
- [Mac OS X LEMP Configuration](https://gist.github.com/petemcw/9265670)
- [oneinstack](https://github.com/lj2007331/oneinstack):OneinStack - A PHP/JAVA Deployment Tool <https://oneinstack.com/>
- [devilbox](https://github.com/cytopia/devilbox):A modern dockerized LAMP and MEAN stack alternative to XAMPP <http://devilbox.org>
- [lamp](https://github.com/teddysun/lamp):Install LAMP(Linux + Apache + MySQL/MariaDB/Percona Server + PHP ) for CentOS/Fedora/Debian/Ubuntu <https://lamp.sh>
| 33.074294 | 768 | 0.735253 | kor_Hang | 0.265463 |
- https://blog.dan.drown.org/gnuk-open-source-gpg-ssh-hardware-key-storage/
- https://shop.nitrokey.com/shop/product/nk-sta-nitrokey-start-6
### Security
- https://niebezpiecznik.pl/post/keepass-jak-zaczac-swoja-przygode-z-managerem-hasel/
- https://zaufanatrzeciastrona.pl/post/yubikey-dobry-na-wszystko-czyli-sprzetowe-wsparcie-logowania-po-ssh/
- https://www.freeipa.org/page/Using_Yubikey_4_Nano_to_authenticate_to_FreeIPA_enrolled_host
- http://www.keepassdroid.com/
- https://keepassxc.org/
- https://github.com/keepassxreboot/keepassxc/blob/develop/docs/QUICKSTART.md#using-sharing
- https://www.hanselman.com/blog/HowToSetupSignedGitCommitsWithAYubiKeyNEOAndGPGAndKeybaseOnWindows.aspx
- https://github.com/drduh/YubiKey-Guide
- https://help.ubuntu.com/community/GnuPrivacyGuardHowto
- https://riseup.net/en/security/message-security/openpgp/best-practices
- https://www.jabberwocky.com/software/paperkey/
- https://ttmm.io/tech/yubikey/
- https://www.devever.net/~hl/smartcards
### Yubikey
- https://nickmooney.com/keybase-pgp-yubikey/
- https://www.emsec.rub.de/media/crypto/veroeffentlichungen/2014/02/04/paper_yubikey_sca.pdf
- https://laptrinhx.com/guide-to-setup-a-yubikey-for-fedora-kde-as-2fa-using-u2f-for-the-sddm-login-screen-lock-screen-sudo-and-su-2686335012/
### Cryptsetup
- https://we.riseup.net/riseuphelp+en/disk-encryption-kde
- https://security.stackexchange.com/questions/5158/for-luks-the-most-preferable-and-safest-cipher
- https://superuser.com/questions/775200/how-do-i-determine-what-ciphers-cipher-modes-i-can-use-in-dm-crypt-luks/1407458#1407458
- https://wiki.gentoo.org/wiki/Dm-crypt_full_disk_encryption
- https://wiki.archlinux.org/index.php/dm-crypt/Device_encryption
### ATTINY25 & key loggers
- https://sekurak.pl/pendrive-przejmujacy-komputer-za-30-zlotych/
- http://www.fourwalledcubicle.com/LUFA.php
- https://hackaday.io/project/17598-diy-usb-rubber-ducky
- https://github.com/hak5darren/USB-Rubber-Ducky/wiki/Payloads
- https://www.immunity-systems.com/hidude_doc/
- https://github.com/immunity-systems/hidude
- [Adafruit Trinket](https://www.adafruit.com/product/1501)
- https://medium.com/@EatonChips/building-a-usb-rubber-ducky-for-7-c851aae30a1d
- https://github.com/beargun/Adafruit-Trinket-Gemma-Bootloader
- https://shop.pimoroni.com/products/zero-stem-usb-otg-connector
- https://maltronics.com/collections/malduinos
- https://shop.hak5.org/products/usb-rubber-ducky-deluxe
- https://maker.pro/arduino/projects/how-to-build-a-rubber-ducky-usb-with-arduino-using-a-digispark-module
- https://pl.aliexpress.com/item/Free-shipping-1pcs-Digispark-kickstarter-development-board-ATTINY85-module-for-Arduino-usb/32697283942.html
- https://pl.aliexpress.com/item/ATTINY88-micro-development-board-16-Mhz-Digispark-ATTINY85-zmodernizowane-NANO-V3-0-ATmega328-rozszerzona-kompatybilny/32995174990.html
- https://rufus.ie/
- https://github.com/spacehuhn/wifi_keylogger
### Rainbow tables
* https://www.freerainbowtables.com/articles/introduction_to_rainbow_tables/
* http://project-rainbowcrack.com/table.htm
### Sites with U2F two-factor keys and other security hardware
- https://twofactorauth.org
- https://www.dongleauth.info/dongles/
- https://github.com/hillbrad/U2FReviews
- https://onlykey.io/
- https://nelenkov.blogspot.com/2014/03/unlocking-android-using-otp.html
- https://techfreak.pl/yubikey-klucz-do-bezpieczenstwa/
- https://www.cnet.com/how-to/how-to-move-google-authenticator-to-a-new-device/
- https://www.sigilance.com/
- https://www.nitrokey.com/
- https://github.com/hillbrad/U2FReviews
- http://shop.fidesmo.com/products
- https://twofactorauth.org/
### MD5 SHA1
- https://blog.jfo.click/advent-of-learning-about-md5-and-opencl/
- https://www.mscs.dal.ca/~selinger/md5collision/
- https://blog.thireus.com/cracking-story-how-i-cracked-over-122-million-sha1-and-md5-hashed-passwords/
- https://www.question-defense.com/2010/08/15/automated-password-cracking-use-oclhashcat-to-launch-a-fingerprint-attack
### Blockchain and cryptocurrency basics
- https://www.youtube.com/watch?v=SSo_EIwHSd4
- https://medium.com/@iwetalaskowska/darmowe-materia%C5%82y-do-nauki-o-kryptowalutach-oraz-blockchainie-3d252d0b2dfb
- https://www.reddit.com/r/BlockchainSchool/comments/aewk4a/material_to_study/
- https://blog.goodaudience.com/how-a-miner-adds-transactions-to-the-blockchain-in-seven-steps-856053271476
- https://medium.com/coinmonks/what-is-a-51-attack-or-double-spend-attack-aa108db63474
- https://medium.com/@iwetalaskowska/skalowalno%C5%9B%C4%87-blockchaina-zrozumie%C4%87-ethereum-2-o-d2db17125bcf
- http://www.sages.com.pl/blog/temat/blockchain/
- https://github.com/ethereum/wiki/wiki/Proof-of-Stake-FAQ
- https://www.youtube.com/watch?v=pzIl3vmEytY
### Blockchain applications
- https://medium.com/@iwetalaskowska/hyperledger-fabric-w-pigu%C5%82ce-e01e7d3a3e2d
- https://hyperledger-fabric.readthedocs.io/
- https://filecoin.io/ #storage
- https://www.iota.org/ #graphchains Tangle
- https://medium.com/@iwetalaskowska/internet-rzeczy-iot-czyli-zastosowanie-kryptowaluty-iota-e7df044f61f3
- https://cryptozombies.io
- https://truffleframework.com/
- http://etherscripter.com
### Blockchain theory
- https://blog.ethereum.org/2014/11/25/proof-stake-learned-love-weak-subjectivity/
### Mining weak coins vulnerable to 51% attacks, for fun
- https://themoneymongers.com/best-cryptocurrency-coin-to-mine-with-gpu/
- https://www.cryptunit.com/coin/RYO
- https://coinpaprika.com/pl/
- https://www.cryptomiso.com/
- https://www.coingecko.com/pl
- https://trezor.io/coins/
- https://github.com/ryo-currency/ryo-wallet
- https://github.com/fireice-uk/xmr-stak
- https://www.bestvpn.com/ethereum-mining/
- https://etherscan.io/ether-mining-calculator
- https://getnerva.org/
- https://getmasari.org/index.html
- https://github.com/turtlecoin/turtle-wallet-electron
- https://turtlecoin.lol/
- http://www.truechain.pro/en/
- https://github.com/nanopool
- https://bitcointalk.org/index.php?topic=1424132.0
- https://1stminingrig.com/best-mining-hardware/
### Simple blockchain examples
- https://hackernoon.com/a-cryptocurrency-implementation-in-less-than-1500-lines-of-code-d3812bedb25c
- https://lhartikk.github.io/
- https://github.com/lhartikk/naivecoin
- https://github.com/lucrussell/tiny-blockchain
- https://medium.com/crypto-currently/lets-build-the-tiniest-blockchain-e70965a248b
- https://github.com/Viraj3f/SimpleCoin
- https://github.com/cosme12/SimpleCoin
- https://medium.com/coinmonks/implementing-proof-of-stake-e26fa5fb8716
- https://medium.com/coinmonks/implementing-blockchain-and-cryptocurrency-with-pow-consensus-algorithm-part-1-545fb32be0c2
### Online key wallets
- https://www.reddit.com/r/homelab/comments/7yefth/what_to_replace_lastpass_with/?sort=new
| 51.338346 | 169 | 0.775483 | yue_Hant | 0.741254 |
3befd7290cbd3a79ad39da848d95ea0c91012ef2 | 2,330 | md | Markdown | README.md | slepox/mongoose-route | 1a59cb38a733a7a15c597f91a9a7c1d65590a0e8 | [
"MIT"
] | 3 | 2016-06-04T01:26:21.000Z | 2016-06-17T03:22:59.000Z | README.md | slepox/mongoose-route | 1a59cb38a733a7a15c597f91a9a7c1d65590a0e8 | [
"MIT"
] | null | null | null | README.md | slepox/mongoose-route | 1a59cb38a733a7a15c597f91a9a7c1d65590a0e8 | [
"MIT"
] | null | null | null | MERA
====
**M**ongoose and **E**xpress built **R**ESTful **A**PI
# Output
When you build a REST API with mera, you get the following endpoints:
- GET /?querya=xx&queryb=xx&start_time=xx&end_time=xx&_filter=xx&_page=x&_perPage=x
  - Full listing support
  - Any allowed field can be added as a query field for filtering
  - A JSON string can be passed in via _filter, if you prefer
  - Use start_time and end_time to filter a time range if timeFilter is set
  - Use _page and _perPage for pagination; X-Total-Count in the response header gives the total count
  - Use _sortDir (ASC|DESC) and _sortField to sort on any allowed field
  - Use format=<xxx> to export in a given format; currently supports
    - csv
    - xlsx
    - json (default)
- GET /:id
  - Get by ID, simple enough
- POST /
  - Use a JSON body to create a new doc; any unknown or disallowed field is omitted
- PUT /:id
  - Use a JSON body to update a doc, similarly
- DELETE /:id
  - Delete by ID
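To make the list parameters concrete, here is an illustrative sketch (not taken from the mera docs) of building a list query for `GET /persons`; the `firstName` field is a hypothetical allowed field from the sample schema below:

```javascript
// Build a list query using the parameters described above.
// URLSearchParams preserves insertion order and handles encoding.
const params = new URLSearchParams({
  firstName: 'John',   // hypothetical allowed field used as a filter
  _page: '2',
  _perPage: '20',
  _sortField: 'lastName',
  _sortDir: 'ASC'
});

const url = '/persons?' + params.toString();
console.log(url);
// e.g. pass this to your HTTP client of choice
```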
# Samples
```javascript
// Create a mongoose model as usual
var mongoose = require('mongoose'), Schema = mongoose.Schema;
var personSchema = new Schema({
  firstName: String,
  lastName: String
});
var Person = mongoose.model('Person', personSchema);
// Create the router with the model
var mera = require('mera');
var router = mera(Person, {
  // All options below are optional; shown as comments so the sample stays valid JavaScript:
  // props:        the properties allowed when listing, creating, or updating
  // propsMapping: { apiProp: 'modelProp' } mapping, if any; id: '_id' is always added
  // baseFilter:   base filter applied when listing
  // defaultSort:  default sort applied when listing
  // protects:     { LIST: function(req, cb), GET: ..., PUT: ... } to protect a certain method
  // _id:          String, used as { options._id: req.params.id } when /:id is passed in
  // omitProps:    [String], props to omit from the output; 'output' is always omitted
  // uploadProps:  { upload_file_prop: 'prop_to_be_replaced' }
  // timeFilter:   'field_name', to use this field for time-range search
  // textFields:   [String], fields searched as text (i.e. using the $regex operator)
});
app.use('/persons', router); // use in app, all RESTful APIs are available.
```
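The `protects` option above can be sketched as follows. This is a hypothetical guard assuming the `(req, cb)` callback shape shown in the options comment; the exact contract is an assumption, not taken from the mera docs:

```javascript
// Hypothetical `protects` guard: pass an error to reject the request,
// or null to let it through.
function requireUser(req, cb) {
  cb(req.user ? null : new Error('forbidden'));
}

// e.g. mera(Person, { protects: { PUT: requireUser, DELETE: requireUser } })
```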
# And the frontend?
This API is a perfect companion to [ng-admin](http://ng-admin-book.marmelab.com/), an Angular-based management UI.
You should be able to satisfy 99% of your management requirements in an hour per model if your db is ready.
| 35.30303 | 114 | 0.696137 | eng_Latn | 0.980298 |
3bf0eb370a4b5553388eabbecc542a97cb5c23c4 | 960 | md | Markdown | README.md | eyobtamir-401n16/lab-06 | d06cd4778b806c41bd407f2dadeb65c0ac96d4fd | [
"MIT"
] | null | null | null | README.md | eyobtamir-401n16/lab-06 | d06cd4778b806c41bd407f2dadeb65c0ac96d4fd | [
"MIT"
] | null | null | null | README.md | eyobtamir-401n16/lab-06 | d06cd4778b806c41bd407f2dadeb65c0ac96d4fd | [
"MIT"
] | 1 | 2020-09-13T11:08:25.000Z | 2020-09-13T11:08:25.000Z | # lab-06
*LAB - Class 06*
### Author:
**Eyob Tamir**
### Links and Resources
[swagger-link](https://app.swaggerhub.com/apis-docs/Eyob1984/lab-06/0.1)
### submission PR
[PR](https://github.com/eyobtamir-401n16/lab-06/pull/1)
### ci/cd (GitHub Actions)
### back-end server url (when applicable)
*http://localhost:3000*
### front-end application (when applicable)
**Not applicable**
### .env requirements (where applicable)
**Not applicable**
### PORT - Port Number
**3000**
### MONGODB_URI - URL to the running mongo instance/db
**Not applicable**
### How to initialize/run your application (where applicable)
**e.g. json-server --watch ./data/db.json**
### How to use your library (where applicable)
### Tests
* How do you run tests? **Not applicable**
* Any tests of note? **Not applicable**
* Describe any tests that you did not complete, skipped, etc. **Not applicable**
### UML
![lab-06-UML](asset/lab-06-UML.jpg)
| 22.325581 | 78 | 0.664583 | eng_Latn | 0.644016 |
3bf268f6f94a74796fc02e5283e17a1767cf5e0b | 2,677 | md | Markdown | README.md | ezYakaEagle442/azure-service-operator | 2ef4852291606a7e286a9494888d17e9e6a1824e | [
"MIT"
] | null | null | null | README.md | ezYakaEagle442/azure-service-operator | 2ef4852291606a7e286a9494888d17e9e6a1824e | [
"MIT"
] | null | null | null | README.md | ezYakaEagle442/azure-service-operator | 2ef4852291606a7e286a9494888d17e9e6a1824e | [
"MIT"
] | null | null | null | # Azure Service Operator Demo
The [Azure Service Operator](https://github.com/Azure/azure-service-operator) is an open source [Kubernetes Operator](https://operatorhub.io/operator/azure-service-operator) that can be installed and run on most current Kubernetes distributions, including [OpenShift Container Platform](https://www.openshift.com), and the fully managed [Azure Red Hat OpenShift](https://azure.microsoft.com/en-us/services/openshift/) service.
![Azure Services in the OpenShift Developer Catalog](docs/images/azure-catalog.png "Azure Services in the OpenShift Developer Catalog")
For this demo, it doesn't matter if you use an instance of *Azure Red Hat OpenShift*, or if you install your own self-managed [OpenShift](https://www.openshift.com/try) cluster on Azure, the instructions and results are the same. After all, Azure Red Hat OpenShift is simply a fully managed instance of OpenShift that comes with a [99.9% financially-backed SLA](https://azure.microsoft.com/en-au/support/legal/sla/openshift/v1_0/).
## Demo Agenda
In this demo we will:
* Install the Azure Service Operator in our OpenShift cluster.
* Build a container image containing a Spring Boot app that connects to a MySQL database.
* Demonstrate a simple "source-to-image" build process to build a container image simply by referencing a git repository with the application source code.
* Deploy this application in a "dev" namespace that uses a MySQL container image. This will demonstrate:
  * How to run the container image that we just built.
  * How quick and easy it is to spin up a MySQL container image to use in a dev/test environment.
  * How the MySQL connection information (including url, username, and password) is injected into the container at run-time to properly externalize environment-specific configuration.
* Deploy this application in a "prod" namespace that uses an [Azure Database for MySQL](https://azure.microsoft.com/en-ca/services/mysql/) instance that is provisioned by OpenShift! This will demonstrate:
  * How to provision a native Azure service (in this case, a managed MySQL instance) directly from OpenShift.
  * The benefits of "configuration as code", and how it can be applied with GitOps practices.
  * How the MySQL connection information (including url, username, and password) is injected into the container at run-time to properly externalize environment-specific configuration. This time, this information will come from a Kubernetes **Secret** that was created by the Azure Service Operator once the database was provisioned.
Let's get started!
[Step 1: Install the Azure Service Operator](docs/01-install-operator.md)
| 95.607143 | 432 | 0.78446 | eng_Latn | 0.989413 |
3bf27fb770d2266d9f5ec256d8b9e5d7580a4ffe | 177 | md | Markdown | tutorials/README.md | msgi/nlp-tour | ffed8c32da69c2427c92a7043f47bfc91e7feb64 | [
"Apache-2.0"
] | 1,559 | 2019-05-27T03:43:29.000Z | 2022-03-31T05:35:04.000Z | tutorials/README.md | msgi/nlp-tour | ffed8c32da69c2427c92a7043f47bfc91e7feb64 | [
"Apache-2.0"
] | 5 | 2019-07-10T11:55:05.000Z | 2020-05-08T12:01:31.000Z | tutorials/README.md | msgi/nlp-tour | ffed8c32da69c2427c92a7043f47bfc91e7feb64 | [
"Apache-2.0"
] | 403 | 2019-06-14T03:36:17.000Z | 2022-03-30T08:09:08.000Z | # tutorials
* [01. Game sales predict](01.game_sales_predict/)
* [02. Bert sentiment classification](02.bert-sentiment-classification/)
* [03. GPT2 chat bot](03.gpt2_chatbot/)
| 29.5 | 72 | 0.745763 | oci_Latn | 0.146965 |
3bf42727d9de61be71d851b106d182679b58d8e8 | 475 | md | Markdown | .github/ISSUE_TEMPLATE/new-trigger---event-source.md | saikrishna169/pipedream | 2e1eb35fd69bd219347739ccdb1da35c1fd9df5f | [
"MIT"
] | 2,387 | 2020-03-04T22:11:04.000Z | 2022-03-31T03:12:20.000Z | .github/ISSUE_TEMPLATE/new-trigger---event-source.md | saikrishna169/pipedream | 2e1eb35fd69bd219347739ccdb1da35c1fd9df5f | [
"MIT"
] | 2,248 | 2020-04-16T04:54:32.000Z | 2022-03-31T20:22:33.000Z | .github/ISSUE_TEMPLATE/new-trigger---event-source.md | saikrishna169/pipedream | 2e1eb35fd69bd219347739ccdb1da35c1fd9df5f | [
"MIT"
] | 1,697 | 2020-03-30T21:19:31.000Z | 2022-03-28T12:23:27.000Z | ---
name: New Trigger / Event Source
about: Need to run a workflow on a specific event, like every time a new PR is opened
on GitHub? Request it here!
title: "[TRIGGER]"
labels: enhancement, good first issue, help wanted, trigger / source
assignees: ''
---
**Describe the event source. What app is this for, and what event does the trigger correspond to?**
**Please provide a link to the relevant API docs for the specific service / operation this trigger is tied to.**
| 33.928571 | 112 | 0.736842 | eng_Latn | 0.998678 |
3bf48f35a7734a3986b7613d8e8d4a6ab7c3029d | 2,933 | markdown | Markdown | _posts/2016-07-16-fight-ezcema-2.markdown | flickerlight/flickerlight.github.io | 8167d2f9aaa5c3b802d54c43ed88f2deb14402be | [
"MIT"
] | null | null | null | _posts/2016-07-16-fight-ezcema-2.markdown | flickerlight/flickerlight.github.io | 8167d2f9aaa5c3b802d54c43ed88f2deb14402be | [
"MIT"
] | null | null | null | _posts/2016-07-16-fight-ezcema-2.markdown | flickerlight/flickerlight.github.io | 8167d2f9aaa5c3b802d54c43ed88f2deb14402be | [
"MIT"
] | null | null | null | ---
layout: post
title: "湿疹作战笔记(下)"
date: 2016-07-16
author: Flickerlight
category: Healthcare
---
<p align="center"><img src="/images/2016-07-16/otcdrugs.jpg"></p>
If the eczema is severe, with intense itching, redness and swelling, or even broken and weeping skin, the daily-care measures described in the previous post are no longer enough; it is time to see a doctor and use medication. For severe eczema, a doctor may prescribe antihistamines to ease the itching, antibiotics for infected broken skin, and the treatment that matters most for eczema yet is the most misunderstood: topical corticosteroids.

Getting nervous at the word "steroid" is hardly unique to China. Even the US National Eczema Association (NEA) needs lengthy explanations to make the case that topical corticosteroids are safe and effective for treating eczema, which shows that fear of steroids is truly universal.

In fact, only long-term, high-dose oral corticosteroids raise concerns about growth and development. Topical corticosteroids may cause local skin thinning or pigmentation only after prolonged use, and even those effects recover over time. The dreaded "steroid-dependent dermatitis" only results from long-term, high-dose use of potent steroids; there is no need to worry about short-term, small-area use of low- to mid-potency topical steroids. Many doctors, such as Cui Yutao, Ji Lianmei and Yang Xichuan, have already published plenty of accessible material on this, easily found on WeChat or Weibo, so I will not repeat it here.

What I mainly want to share are a few points about everyday eczema medication (chiefly topical corticosteroids):
### Before starting any medication, get a diagnosis from a professional dermatologist
In parenting groups, a mother will often post a photo and ask what to do; the others take a look, decide it is probably eczema, and the discussion quickly moves on to what to buy and how to apply it. Eczema does have typical symptoms, but parents cannot always judge it on their own. Have a doctor confirm the diagnosis at the hospital first, and the subsequent treatment and care will rest on solid ground.
### Use the weakest corticosteroid that still controls the eczema
0.1% hydrocortisone butyrate (Youzhuoer) is the weakest over-the-counter topical corticosteroid you can buy yourself, and the FDA approves it for infants from 3 months of age. Some hospitals also compound even weaker steroid ointments of their own, which are likewise safe to use. If the doctor decides a stronger steroid is needed, or combines it with other drugs (antihistamines, antibiotics or immunomodulators), follow the doctor's instructions.
<p align="center"><img src="/images/2016-07-16/youzhuoer.jpg"></p>
Attached for reference is the list of FDA-approved corticosteroids for children (the ones in green are the common drugs for childhood eczema in China; for those without a familiar Chinese name, I have not yet found a domestic equivalent):
<p align="center"><img src="/images/2016-07-16/torpical_potency.jpg"></p>
### Use the "fingertip rule" to determine the amount of steroid
How much steroid ointment should you apply? This question once puzzled me too. Too little and it may not work; too much feels unsettling. The NEA offers a handy measure: a line of ointment the length of an adult's index finger from the tip to the first crease (one "fingertip unit", roughly 0.5 g), as shown below:
<p align="center"><img src="/images/2016-07-16/finger_rule.jpg"></p>
**One fingertip unit of ointment covers an area of skin the size of two adult palms (fingers included).**
The American Academy of Family Physicians (AAFP) gives the following dosage table:
<p align="center"><img src="/images/2016-07-16/Fingerunits.jpg"></p>
**Apply at most twice a day; very often once is enough.** Spread a thin layer over the eczema, wait a little while, then apply moisturizer on top. Because children's skin is thin and absorbs more corticosteroid than adults' skin does, **for eczema covering more than one third of the body surface, follow the doctor's dosage instructions strictly.**
### When the skin is broken, do not use ordinary skincare products, and use antibiotics only as directed
This one is easy to understand: broken skin means an open wound. Skincare products applied there may irritate the damaged skin and provoke more inflammation, and some ingredients entering the bloodstream through the wound could even cause allergic reactions.
Hospitals sometimes prescribe zinc oxide ointment to protect lesions with mild oozing, which is safe. Note, however, that zinc oxide ointment is not suitable for long-term, large-area use by eczema patients; once the broken skin has healed, switch back to your usual moisturizer.
If the broken skin develops a secondary infection, a topical antibiotic (such as Bactroban/mupirocin) may also be needed. Although Bactroban is over the counter, it is still best to follow a doctor's advice rather than reach for it whenever the skin breaks.
### Use calamine lotion for itching with caution
Although this post is mainly about corticosteroids, the wonder drug calamine lotion deserves a mention. Heat rash, mosquito bites, hives... calamine lotion is a real workhorse against itching. Since itching is also a classic symptom of eczema, many doctors prescribe calamine lotion to relieve it.
For eczema, however, calamine lotion is a double-edged sword. Its main ingredients, calamine and zinc oxide, both absorb moisture and act as astringents, which dries the skin, and a lotion is a poor moisturizer to begin with. What eczema fears most is dry skin. Relieving a moment's itch at the cost of drier skin that ultimately worsens the eczema is hardly a good trade.
My advice: if calamine lotion really does relieve the itching, go ahead and use it, but be sure to moisturize well afterwards.
### Herbal washes? Folk remedies? Are you kidding me?
Anyone who knows me knows where I stand on traditional Chinese medicine. I will not expand on that here, only state my position: for mild eczema, good moisturizing is all you need; for severe eczema, see a proper doctor and use proper medicine instead of fiddling with folk remedies and herbal washes.
A side note: even proper doctors are not always so proper. A doctor at Xinhua Hospital once casually prescribed me 500 yuan worth of herbal wash, while the steroid creams together cost only 5. When I went back to return it, she insisted that "since herbal medicine works, of course we should use it", then took a closer look at the order, realized she really had prescribed rather a lot, and quietly stamped a full refund. That herbal wash is probably most effective at generating revenue.
----------
### References
---
[1] NEA: ["Information on topical corticosteroids"](https://nationaleczema.org/eczema/treatment/topical-corticosteroids/)
[2] AAFP: ["Choosing Topical Corticosteroids"](http://www.aafp.org/afp/2009/0115/p135.html)
[3] National Psoriasis Foundation (NPF): [corticosteroid potency chart](https://www.psoriasis.org/about-psoriasis/treatments/topicals/steroids/potency-chart)
[4] Guokr: ["Zinc oxide ointment is no cure-all for skin problems"](http://www.guokr.com/article/439146/) | 51.672414 | 439 | 0.77344 | yue_Hant | 0.60985 |
3bf58d615c9325cdc856318e55de8232f29c0be6 | 2,591 | md | Markdown | README.md | JuGGerNaunT/Gm1KonverterCrossPlatform | fcab4be95fcd1df56a788a7eb22c31b39f92c731 | [
"MIT"
] | null | null | null | README.md | JuGGerNaunT/Gm1KonverterCrossPlatform | fcab4be95fcd1df56a788a7eb22c31b39f92c731 | [
"MIT"
] | null | null | null | README.md | JuGGerNaunT/Gm1KonverterCrossPlatform | fcab4be95fcd1df56a788a7eb22c31b39f92c731 | [
"MIT"
] | null | null | null |
Gm1KonverterCrossPlatform
=======================
A tool to convert Stronghold's GM1 files to PNG, and PNG files back to GM1.
English:
---------
Hello guys,
I started programming a GM1 exporter/importer for Windows and Linux.
If you have questions, just add me on Discord: Gaaammmler#1397
To download the program, click the link below and download Converter.zip under the Assets section.
![Download](https://github.com/Gaaammmler/Gm1KonverterCrossPlatform/releases)
![img2](https://github.com/Gaaammmler/Gm1KonverterCrossPlatform/blob/master/GMConverterImages/img2.JPG)
When you first start the program, select your Stronghold GM1 folder and a work folder under Options; you can also choose between languages: German/Russian/English.
![img1](https://github.com/Gaaammmler/Gm1KonverterCrossPlatform/blob/master/GMConverterImages/img1.JPG)
For more info on a specific file type, click the info icon.
![img3](https://github.com/Gaaammmler/Gm1KonverterCrossPlatform/blob/master/GMConverterImages/img3.JPG)
If you want Stronghold 1 Graphics in Crusader:
![SH1 Graphics](https://github.com/Gaaammmler/Stronghold-Crusader-Sh1-Graphics)
Thanks to
![Lolasik011](https://github.com/Lolasik011) for the Russian translation
![metalvoidzz](https://github.com/metalvoidzz) for his tutorial on how to decode GM1 files
Deutsch
---------
Hallo Leute, ich habe angefangen, einen GM1-Exporter/-Importer für Windows und Linux zu programmieren.
Falls du Fragen hast, adde mich auf Discord: Gaaammmler#1397
Um das Programm herunterzuladen, klicke auf den Link und lade die Converter.zip unterhalb des Assets-Symbols herunter.
![Download](https://github.com/Gaaammmler/Gm1KonverterCrossPlatform/releases)
![img2](https://github.com/Gaaammmler/Gm1KonverterCrossPlatform/blob/master/GMConverterImages/img2.JPG)
Wenn du das Programm zum ersten Mal startest, müssen unter den Optionen der GM1-Ordner und der Arbeitsordner ausgewählt werden; außerdem kann die Sprache zwischen Englisch/Deutsch/Russisch geändert werden.
![img1](https://github.com/Gaaammmler/Gm1KonverterCrossPlatform/blob/master/GMConverterImages/img1.JPG)
Für mehr Informationen zu einem Dateityp klicke auf das Info-Icon.
![img3](https://github.com/Gaaammmler/Gm1KonverterCrossPlatform/blob/master/GMConverterImages/img3.JPG)
Falls du Stronghold 1 Grafiken in Crusader möchtest:
![SH1 Graphics](https://github.com/Gaaammmler/Stronghold-Crusader-Sh1-Graphics)
Thanks to
![Lolasik011](https://github.com/Lolasik011) for the Russian translation
![metalvoidzz](https://github.com/metalvoidzz) for his tutorial on how to decode GM1 files
| 38.102941 | 202 | 0.798919 | kor_Hang | 0.24377 |
3bf59438d7cedcd3a75efca7b36cc34b1c860c8a | 578 | md | Markdown | README.md | Devilbinder/PIC_18F_Segment_Displays | 69959e2c0be41db6040f84c80269468ebd5cfb96 | [
"MIT"
] | 1 | 2020-06-07T22:21:40.000Z | 2020-06-07T22:21:40.000Z | README.md | Devilbinder/PIC_18F_Segment_Displays | 69959e2c0be41db6040f84c80269468ebd5cfb96 | [
"MIT"
] | null | null | null | README.md | Devilbinder/PIC_18F_Segment_Displays | 69959e2c0be41db6040f84c80269468ebd5cfb96 | [
"MIT"
] | null | null | null | # **PIC 18F Segment Displays**
This demonstrates how to set up segment LED displays with multiplexing, using the XC8 compiler with MPLAB X and a PIC18F4520.
[![PIC Programming Tutorial #26 - 7 Segment LED Display](https://img.youtube.com/vi/ePyTYILiAzI/0.jpg)](https://www.youtube.com/watch?v=ePyTYILiAzI "PIC Programming Tutorial #26 - 7 Segment LED Display")
☕Coffee Funds☕.
Shekels:
https://www.paypal.me/bindertronics9/5
Patreon:
https://www.patreon.com/BinderTronics
Bitcoin:
19nohZzWXxVuZ9tZvw8Pvhajt5khG5mspW
Ethereum:
0x5fe29789CDaE8c73C9791bEe36c7ad5db8511D39
| 28.9 | 203 | 0.788927 | yue_Hant | 0.386341 |
3bf59f365a54b2c0cb3e853eeddc9242cbabb1d8 | 1,820 | md | Markdown | docs/AWS/Redshift.md | purescript-aws-sdk/purescript-aws-redshift | 5049e0fcc060e62903069ea929e6da82f635743c | [
"MIT"
] | null | null | null | docs/AWS/Redshift.md | purescript-aws-sdk/purescript-aws-redshift | 5049e0fcc060e62903069ea929e6da82f635743c | [
"MIT"
] | null | null | null | docs/AWS/Redshift.md | purescript-aws-sdk/purescript-aws-redshift | 5049e0fcc060e62903069ea929e6da82f635743c | [
"MIT"
] | null | null | null | ## Module AWS.Redshift
<fullname>Amazon Redshift</fullname> <p> <b>Overview</b> </p> <p>This is an interface reference for Amazon Redshift. It contains documentation for one of the programming or command line interfaces you can use to manage Amazon Redshift clusters. Note that Amazon Redshift is asynchronous, which means that some interfaces may require techniques, such as polling or asynchronous callback handlers, to determine when a command has been applied. In this reference, the parameter descriptions indicate whether a change is applied immediately, on the next instance reboot, or during the next maintenance window. For a summary of the Amazon Redshift cluster management interfaces, go to <a href="http://docs.aws.amazon.com/redshift/latest/mgmt/using-aws-sdk.html">Using the Amazon Redshift Management Interfaces</a>.</p> <p>Amazon Redshift manages all the work of setting up, operating, and scaling a data warehouse: provisioning capacity, monitoring and backing up the cluster, and applying patches and upgrades to the Amazon Redshift engine. You can focus on using your data to acquire new insights for your business and customers.</p> <p>If you are a first-time user of Amazon Redshift, we recommend that you begin by reading the <a href="http://docs.aws.amazon.com/redshift/latest/gsg/getting-started.html">Amazon Redshift Getting Started Guide</a>.</p> <p>If you are a database developer, the <a href="http://docs.aws.amazon.com/redshift/latest/dg/welcome.html">Amazon Redshift Database Developer Guide</a> explains how to design, build, query, and maintain the databases that make up your data warehouse. </p>
#### `Service`
``` purescript
newtype Service
= Service Service
```
#### `service`
``` purescript
service :: forall eff. Options -> Eff (exception :: EXCEPTION | eff) Service
```
| 95.789474 | 1,608 | 0.776374 | eng_Latn | 0.982859 |
3bf5f9308195c6dea325cd3a5bcb1b5e738d545d | 1,044 | md | Markdown | README.md | JonatronLeon/eslint-plugin-leon-require-jsdoc | 63ed35e1357c2c6ef01de687480b64f3c8b8919e | [
"MIT"
] | null | null | null | README.md | JonatronLeon/eslint-plugin-leon-require-jsdoc | 63ed35e1357c2c6ef01de687480b64f3c8b8919e | [
"MIT"
] | null | null | null | README.md | JonatronLeon/eslint-plugin-leon-require-jsdoc | 63ed35e1357c2c6ef01de687480b64f3c8b8919e | [
"MIT"
] | null | null | null | # eslint-plugin-leon-require-jsdoc
An update to the require-jsdoc rule that catches more function expressions and declarations.
## Installation
You'll first need to install [ESLint](http://eslint.org):
```
$ npm i eslint --save-dev
```
Next, install `eslint-plugin-leon-require-jsdoc`:
```
$ npm install eslint-plugin-leon-require-jsdoc --save-dev
```
**Note:** If you installed ESLint globally (using the `-g` flag) then you must also install `eslint-plugin-leon-require-jsdoc` globally.
## Usage
Add `leon-require-jsdoc` to the plugins section of your `.eslintrc` configuration file. You can omit the `eslint-plugin-` prefix:
```json
{
"plugins": [
"leon-require-jsdoc"
],
"rules": {
//...
"leon-require-jsdoc/leon-require-jsdoc": ["error", {
"require": {
"FunctionDeclaration": true,
"MethodDefinition": true,
"ClassDeclaration": true,
"ArrowFunctionExpression": true
}
}]
//...
}
}
```
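For illustration, here is the kind of code the configuration above targets. This example is hypothetical (not from the plugin's docs): with `ArrowFunctionExpression: true`, ESLint would also demand JSDoc on arrow function expressions, which the stock require-jsdoc rule can miss.

```javascript
/**
 * Adds two numbers.
 * @param {number} a First addend.
 * @param {number} b Second addend.
 * @returns {number} The sum.
 */
const add = (a, b) => a + b;

// Without the JSDoc block above, this arrow function would be reported.
```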
| 21.306122 | 136 | 0.604406 | eng_Latn | 0.627018 |
3bf66fec10ccc460b2dd4a60bc441a2464f57e5f | 933 | md | Markdown | content/events/2016-dallas/program/marissa-lerer.md | devopsdays/devopsdays-test | e2766c9a61cf053c7949d37fe420e8750b4549dc | [
"Apache-2.0",
"MIT"
] | null | null | null | content/events/2016-dallas/program/marissa-lerer.md | devopsdays/devopsdays-test | e2766c9a61cf053c7949d37fe420e8750b4549dc | [
"Apache-2.0",
"MIT"
] | null | null | null | content/events/2016-dallas/program/marissa-lerer.md | devopsdays/devopsdays-test | e2766c9a61cf053c7949d37fe420e8750b4549dc | [
"Apache-2.0",
"MIT"
] | null | null | null | +++
City = "Dallas"
Year = "2016"
date = "2016-09-15T09:00:00-08:00"
title = "Marissa Levy Lerer"
type = "talk"
+++
**Title:** Where My Ladies At: The Radical Topic of Women in Tech
**Description:**
There's no disputing that the future has arrived for pocket computers, autonomous cars and croissants that are also, somehow, donuts. Innovation is moving at light-speed, but women are being left behind when it comes to tech. We are still having the same tired conversation about (say it with me) "women in tech." We are inundated with jargon about leaning in, opting out, and work-life balance, which can be discouraging for women, especially those beginning their careers. The presenter, a CTO, mom, and proud chick coder, understands what it's like to survive the "woman in tech" scene. Join her for an empowering journey through where we've been and where we're heading, with some unsolicited advice about making the most of your career.
3bf8f0fb16be4a85262bcac2d325726e951cf302 | 812 | md | Markdown | _posts/2017-11-11-Moonlight-Style-MT9183.md | nicedaymore/nicedaymore.github.io | 4328715a75c752dd765c77f1bdad68267ad04b61 | [
"MIT"
] | null | null | null | _posts/2017-11-11-Moonlight-Style-MT9183.md | nicedaymore/nicedaymore.github.io | 4328715a75c752dd765c77f1bdad68267ad04b61 | [
"MIT"
] | null | null | null | _posts/2017-11-11-Moonlight-Style-MT9183.md | nicedaymore/nicedaymore.github.io | 4328715a75c752dd765c77f1bdad68267ad04b61 | [
"MIT"
] | null | null | null | ---
layout: post
date: 2017-11-11
title: "Moonlight Style MT9183"
category: Moonlight
tags: [Moonlight]
---
### Moonlight Style MT9183
Just **$199.99**
<table><tr><td>BRANDS</td><td>Moonlight</td></tr></table>
<a href="https://www.readybrides.com/en/moonlight/48299-moonlight-style-mt9183.html"><img src="//img.readybrides.com/106897/moonlight-style-mt9183.jpg" alt="Moonlight Style MT9183" style="width:100%;" /></a>
<!-- break --><a href="https://www.readybrides.com/en/moonlight/48299-moonlight-style-mt9183.html"><img src="//img.readybrides.com/106896/moonlight-style-mt9183.jpg" alt="Moonlight Style MT9183" style="width:100%;" /></a>
Buy it: [https://www.readybrides.com/en/moonlight/48299-moonlight-style-mt9183.html](https://www.readybrides.com/en/moonlight/48299-moonlight-style-mt9183.html)
| 50.75 | 221 | 0.725369 | yue_Hant | 0.544029 |
3bf91ae9abea362c30e2dd4236edf450d997f140 | 2,997 | md | Markdown | BigFront-end/References.md | yogkin/Programming-Notes | bc7f0b7e37e453a19cd0ebea3a47c501931b1547 | [
"Apache-2.0"
] | 1 | 2020-10-31T07:15:53.000Z | 2020-10-31T07:15:53.000Z | BigFront-end/References.md | yogkin/Programming-Notes | bc7f0b7e37e453a19cd0ebea3a47c501931b1547 | [
"Apache-2.0"
] | null | null | null | BigFront-end/References.md | yogkin/Programming-Notes | bc7f0b7e37e453a19cd0ebea3a47c501931b1547 | [
"Apache-2.0"
] | null | null | null | ## TODO
- nodejs
- GraphQL
- PWA
- WebAssembly
## Reference
PWA:
- [第一本 PWA 中文书](https://github.com/SangKa/PWA-Book-CN)
Flutter:
- [flutter github](https://github.com/flutter/flutter)
- [flutter专栏教程](http://blog.csdn.net/column/details/13593.html)
- [flutter-study](https://github.com/yang7229693/flutter-study)
- [如何评价 Google 的 Fuchsia、Android、iOS 跨平台应用框架 Flutter?](https://www.zhihu.com/question/50156415)
- [开发工具总结(10)之Flutter从配置安装到填坑指南详解](https://www.jianshu.com/p/399c01657920)
- [flutter cn](https://flutter-io.cn/)
- [Awesome Flutter:带你从入门到进阶的 Flutter 指南](https://juejin.im/post/5b2869e66fb9a00e5f3e861f)
- [Flutter的原理及美团的实践](https://mp.weixin.qq.com/s?__biz=MjM5NjQ5MTI5OA==&mid=2651748565&idx=1&sn=f92ce52627b680529c3c31e393779168&chksm=bd12a1988a65288eec838dbe64a31990f64baff2093f85ba8c75f581fcd5883947867d7a20a0&mpshare=1&scene=1&srcid=08095QIv3usd64vN4liBBi1c#rd)
- [Kotlin + MVP + Flutter ,让你可以在自己的项目中集成 Flutter 并使用](https://juejin.im/post/5b7cf52e51882542c963f0f1)
WeChat Mini Programs:
- [如何入门微信小程序开发,有哪些学习资料?](https://www.zhihu.com/question/50907897)
- [一名Android开发者的微信小程序填坑之路(1)](http://blog.csdn.net/luoyanglizi/article/details/52681245)
- [微信小应用资源汇总整理](https://github.com/lypeer/awesome-wechat-weapp)
- [微信小程序开发实战课程之油耗计算器](http://edu.csdn.net/course/detail/3839)
- [微信小程序征服指南](http://weapp.masterstudio.tech/guide/guide_article.html)
WebRTC:
- [官网](https://webrtc.org/start/)
- [WebRTC介绍](https://github.com/ChenYilong/WebRTC)
- [即时通讯网](http://www.52im.net/thread-50-1-1.html)
- [开箱即用的 WebRTC 开发环境](https://blog.piasy.com/2017/06/17/out-of-the-box-webrtc-dev-env/)
- [苹果终于入伙 WebRTC,新一代移动 Web 应用爆发路上还有哪些坑?](https://banburytang.github.io/2017/06/webrtc/)
RN:
- [写给移动开发者的 React Native 指南](http://www.jianshu.com/p/b88944250b25)
- [react-native](https://facebook.github.io/react-native/)
- [Android开发技术周报特刊之React Native](http://androidweekly.cn/android-dev-special-weekly-react-native/)
- [react-native 系列资源-江清清](http://www.lcode.org/react-native/)
- [一次RN跨平台开发之旅GitFeed](http://xiekw2010.github.io/2016/02/11/rngitfeed)
- [构建 F8 2016 App](http://f8-app.liaohuqiu.net/tutorials/building-the-f8-app/planning/)
- <a href="https://mp.weixin.qq.com/s?__biz=MzI1MTA1MzM2Nw==&mid=2649796767&idx=1&sn=9a499453b627a223e0c2863658dd0329&scene=0&key=b28b03434249256be926ee356516850b2069dc5d3b518f346e46c35ede8e6372e0d30db16263cd591cb592f8c08ff2e1&ascene=0&uin=MjAyNzY1NTU%3D&devicetype=iMac+MacBookPro12%2C1+OSX+OSX+10.11.3+build(15D21)&version=11020201&pass_ticket=05otuT9MFSWXO43Jv%2FenDGRq0%2Fe5PIh10dUcIP%2BNntg%3D">【ReactNative For Android】框架启动核心路径剖析</a>
- [react-native-lesson-资料大全](https://github.com/vczero/react-native-lesson)
- [React Native专题](http://www.lcode.org/react-native/)
- [React Native周报](http://www.lcode.org/category/react-native-zong/react-native%E6%8A%80%E6%9C%AF%E5%91%A8%E6%8A%A5/)
- [React-Native学习指南](https://github.com/reactnativecn/react-native-guide)
- [React Native中文网](http://reactnative.cn/)
- [react-native-Gank](https://github.com/wangdicoder/react-native-Gank)
| 51.672414 | 439 | 0.77344 | yue_Hant | 0.60985 |
3bfa83c756dd4a26ffe688dd4290a0139abb26e9 | 39 | md | Markdown | README.md | ic2hrmk/assync_nmap | 8845f491fbbdf4a84c85804acaa1c189ccd888ce | [
"MIT"
] | null | null | null | README.md | ic2hrmk/assync_nmap | 8845f491fbbdf4a84c85804acaa1c189ccd888ce | [
"MIT"
] | null | null | null | README.md | ic2hrmk/assync_nmap | 8845f491fbbdf4a84c85804acaa1c189ccd888ce | [
"MIT"
] | null | null | null | # assync_nmap
Multithreaded nmap in Ruby
| 13 | 24 | 0.820513 | eng_Latn | 0.828445 |
3bfb6d68a0abf524907f35beb9269c75e2292f1a | 655 | md | Markdown | README.md | alexanderseo/shopminiyii2 | 05bcca9f6df5d1f1cec79a2fec4b2f88b3dcec34 | [
"BSD-3-Clause"
] | null | null | null | README.md | alexanderseo/shopminiyii2 | 05bcca9f6df5d1f1cec79a2fec4b2f88b3dcec34 | [
"BSD-3-Clause"
] | null | null | null | README.md | alexanderseo/shopminiyii2 | 05bcca9f6df5d1f1cec79a2fec4b2f88b3dcec34 | [
"BSD-3-Clause"
] | null | null | null | <h1>PHP – Test Assignment</h1>
Implement product and order management tools in Yii2 with the minimal necessary interface. Design a MySQL database for this, with the required tables and the relations between them.
<strong>For products:</strong>
<ul>
<li>Create a product (name, price, quantity, etc.)</li>
<li>Edit a product</li>
<li>View the list of products and search/filter them</li>
</ul>
<strong>For orders:</strong>
<ul>
<li>Create an order</li>
<li>Edit an order</li>
<li>Add and remove available products</li>
<li>Change the status</li>
<li>View the list of orders and search/filter them (by status, by number, etc.)</li>
</ul>