Commit 3b500e2

custom_function_double_backward_tutorial.rst translation (#1045)
* custom_function_double_backward_tutorial.rst translation
* Revised to reflect PR feedback
* Applied PR review fixes: corrected spacing
* Revised the translator credit
* Revised the translator credit
* Revised the translator credit
* Revised the translator credit
* Revised the translator credit
* Revised the translator credit
* Fixed rst syntax
* Adjusted the translator credit to valid rst syntax and corrected spacing
1 parent f4dd6ea commit 3b500e2

File tree: 1 file changed

intermediate_source/custom_function_double_backward_tutorial.rst

Lines changed: 62 additions & 70 deletions
@@ -1,42 +1,40 @@
-Double Backward with Custom Functions
+Custom Functions and Double Backward
 =====================================
+**Translation**: `박건수 <https://github.com/ParkKunsu>`_

-It is sometimes useful to run backwards twice through backward graph, for
-example to compute higher-order gradients. It takes an understanding of
-autograd and some care to support double backwards, however. Functions
-that support performing backward a single time are not necessarily
-equipped to support double backward. In this tutorial we show how to
-write a custom autograd function that supports double backward, and
-point out some things to look out for.

+It is occasionally useful to run backward twice through the backward graph,
+for example to compute higher-order gradients. Supporting double backward,
+however, requires an understanding of autograd and some care. Supporting a
+single backward pass does not necessarily mean double backward is supported.
+This tutorial shows how to support double backward with a custom function and points out what to watch out for.

-When writing a custom autograd function to backward through twice,
-it is important to know when operations performed in a custom function
-are recorded by autograd, when they aren't, and most importantly, how
-`save_for_backward` works with all of this.

-Custom functions implicitly affects grad mode in two ways:
+When using a custom autograd function with double backward, it is important to
+understand how things behave inside the function: when computed results are
+recorded and when they are not. Above all, it is most important to know how
+`save_for_backward` works throughout this whole process.

-- During forward, autograd does not record any the graph for any
-  operations performed within the forward function. When forward
-  completes, the backward function of the custom function
-  becomes the `grad_fn` of each of the forward's outputs
+Custom functions implicitly affect grad mode in two ways:

-- During backward, autograd records the computation graph used to
-  compute the backward pass if create_graph is specified
+- During forward, autograd does not record in the graph any operation performed
+  inside the forward function. Once forward completes, the backward of the
+  custom function becomes the `grad_fn` of the forward's outputs.

-Next, to understand how `save_for_backward` interacts with the above,
-we can explore a couple examples:
+- During backward, if create_graph is specified,
+  autograd records the backward computations in the graph.

+Next, to understand how `save_for_backward` interacts with the above,
+let's look at a couple of examples.

-Saving the Inputs
+
+Saving the Inputs
 -------------------------------------------------------------------
-Consider this simple squaring function. It saves an input tensor
-for backward. Double backward works automatically when autograd
-is able to record operations in the backward pass, so there is usually
-nothing to worry about when we save an input for backward as
-the input should have grad_fn if it is a function of any tensor
-that requires grad. This allows the gradients to be properly propagated.
+Consider a simple squaring function. It saves an input tensor for backward.
+Double backward works automatically when autograd can record the backward pass,
+so there is usually nothing to worry about when we save an input for backward:
+if the input is a function of any tensor that requires grad, it has a grad_fn,
+and this allows the gradients to propagate correctly.

 .. code:: python

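The diff above elides the body of the squaring example; only the surrounding `gradcheck`/`gradgradcheck` calls appear as context in the next hunk. As a rough, illustrative sketch (assuming the standard `torch.autograd.Function` pattern, not a verbatim copy of the tutorial file), the function being described looks something like this:

.. code-block:: python

    import torch

    class Square(torch.autograd.Function):
        @staticmethod
        def forward(ctx, x):
            # Save the input tensor for backward; use save_for_backward for
            # tensors that are inputs or outputs of the custom function
            ctx.save_for_backward(x)
            return x ** 2

        @staticmethod
        def backward(ctx, grad_out):
            # When backward runs with create_graph=True, autograd records these
            # operations, which is what makes double backward work automatically
            x, = ctx.saved_tensors
            return grad_out * 2 * x

    # Double precision keeps the finite-difference checks numerically stable
    x = torch.rand(3, 3, requires_grad=True, dtype=torch.double)
    torch.autograd.gradcheck(Square.apply, x)       # first-order gradients
    torch.autograd.gradgradcheck(Square.apply, x)   # second-order gradients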
@@ -64,7 +62,7 @@ that requires grad. This allows the gradients to be properly propagated.
     torch.autograd.gradgradcheck(Square.apply, x)


-We can use torchviz to visualize the graph to see why this works
+We can visualize the graph with torchviz to see why this works.

 .. code-block:: python

@@ -75,18 +73,17 @@ We can use torchviz to visualize the graph to see why this works
     grad_x, = torch.autograd.grad(out, x, create_graph=True)
     torchviz.make_dot((grad_x, x, out), {"grad_x": grad_x, "x": x, "out": out})

-We can see that the gradient wrt to x, is itself a function of x (dout/dx = 2x)
-And the graph of this function has been properly constructed
+We can see that the gradient with respect to x is itself a function of x (dout/dx = 2x),
+and the graph of this function has been constructed correctly.

 .. image:: https://user-images.githubusercontent.com/13428986/126559699-e04f3cb1-aaf2-4a9a-a83d-b8767d04fbd9.png
    :width: 400


-Saving the Outputs
+Saving the Outputs
 -------------------------------------------------------------------
-A slight variation on the previous example is to save an output
-instead of input. The mechanics are similar because outputs are also
-associated with a grad_fn.
+A slight variation on the previous example is to save the output instead of the input.
+The mechanics are similar because outputs are also associated with a grad_fn.

 .. code-block:: python

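The body of the exponential example is likewise not shown; a sketch consistent with the `Exp.apply` checks visible in the next hunk, saving the output rather than the input, might be:

.. code-block:: python

    import torch

    class Exp(torch.autograd.Function):
        @staticmethod
        def forward(ctx, x):
            # Save the output instead of the input; outputs are also associated
            # with a grad_fn, so double backward still works automatically
            result = torch.exp(x)
            ctx.save_for_backward(result)
            return result

        @staticmethod
        def backward(ctx, grad_out):
            result, = ctx.saved_tensors
            return result * grad_out

    x = torch.rand(3, 3, requires_grad=True, dtype=torch.double)
    torch.autograd.gradcheck(Exp.apply, x)
    torch.autograd.gradgradcheck(Exp.apply, x)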
@@ -111,7 +108,7 @@ associated with a grad_fn.
     torch.autograd.gradcheck(Exp.apply, x)
     torch.autograd.gradgradcheck(Exp.apply, x)

-Use torchviz to visualize the graph:
+Visualize the graph with torchviz.

 .. code-block:: python

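The torchviz call for this example is not part of the diff; by analogy with the `make_dot` snippet shown earlier for the squaring function, it presumably looks roughly like the following (the variable names are assumptions, the `Exp` class is the sketch above, and `torchviz` must be installed):

.. code-block:: python

    import torch
    import torchviz

    x = torch.tensor(1., requires_grad=True, dtype=torch.double)
    out = Exp.apply(x)
    # create_graph=True records the backward pass so it can be visualized
    grad_x, = torch.autograd.grad(out, x, create_graph=True)
    torchviz.make_dot((grad_x, x, out), {"grad_x": grad_x, "x": x, "out": out})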
@@ -123,23 +120,22 @@ Use torchviz to visualize the graph:
    :width: 332


-Saving Intermediate Results
+Saving Intermediate Results
 -------------------------------------------------------------------
-A more tricky case is when we need to save an intermediate result.
-We demonstrate this case by implementing:
+Saving an intermediate result is a bit trickier.
+We demonstrate this by implementing:

 .. math::
    sinh(x) := \frac{e^x - e^{-x}}{2}

-Since the derivative of sinh is cosh, it might be useful to reuse
-`exp(x)` and `exp(-x)`, the two intermediate results in forward
-in the backward computation.
+Since the derivative of sinh is cosh, it can be efficient to reuse
+`exp(x)` and `exp(-x)`, the two intermediate results of the forward pass, in the backward computation.

-Intermediate results should not be directly saved and used in backward though.
-Because forward is performed in no-grad mode, if an intermediate result
-of the forward pass is used to compute gradients in the backward pass
-the backward graph of the gradients would not include the operations
-that computed the intermediate result. This leads to incorrect gradients.
+Intermediate results must not be saved directly and used in backward, though.
+Because forward runs in no-grad mode, if an intermediate result of the forward
+pass is used to compute gradients in the backward pass, the backward graph of
+the gradients will not include the operations that computed the intermediate result.
+As a result, the gradients will be incorrect.

 .. code-block:: python

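The sinh implementation itself is also elided; the context line `torch.autograd.gradgradcheck(sinh, x)` in the next hunk suggests a wrapper around a custom Function. A sketch of the intended trick, returning the intermediates as extra outputs so that they get a `grad_fn` (an assumed shape, not a verbatim copy of the file), could be:

.. code-block:: python

    import torch

    class Sinh(torch.autograd.Function):
        @staticmethod
        def forward(ctx, x):
            expx = torch.exp(x)
            expnegx = torch.exp(-x)
            ctx.save_for_backward(expx, expnegx)
            # Trick: also return the intermediates as outputs, so that the
            # backward graph through them is constructed
            return (expx - expnegx) / 2, expx, expnegx

        @staticmethod
        def backward(ctx, grad_out, _grad_out_exp, _grad_out_negexp):
            expx, expnegx = ctx.saved_tensors
            grad_input = grad_out * (expx + expnegx) / 2
            # The extra grad outputs cannot simply be ignored; they carry the
            # gradients flowing back through the intermediate outputs
            grad_input += _grad_out_exp * expx
            grad_input -= _grad_out_negexp * expnegx
            return grad_input

    def sinh(x):
        # Wrapper that exposes only the first output
        return Sinh.apply(x)[0]

    x = torch.rand(3, 3, requires_grad=True, dtype=torch.double)
    torch.autograd.gradcheck(sinh, x)
    torch.autograd.gradgradcheck(sinh, x)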
@@ -172,7 +168,7 @@ that computed the intermediate result. This leads to incorrect gradients.
     torch.autograd.gradgradcheck(sinh, x)


-Use torchviz to visualize the graph:
+Visualize the graph with torchviz.

 .. code-block:: python

@@ -184,12 +180,11 @@ Use torchviz to visualize the graph:
    :width: 460


-Saving Intermediate Results: What not to do
+Saving Intermediate Results: What not to do
 -------------------------------------------------------------------
-Now we show what happens when we don't also return our intermediate
-results as outputs: `grad_x` would not even have a backward graph
-because it is purely a function `exp` and `expnegx`, which don't
-require grad.
+Now let's see what happens when we don't also return the intermediate results as outputs:
+`grad_x` would not even have a backward graph,
+because it is purely a function of `exp` and `expnegx`, which don't require grad.

 .. code-block:: python

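For contrast, a sketch of the anti-pattern this section warns about, using the saved intermediates directly in backward without returning them as outputs (illustrative only, not the exact tutorial code), would be:

.. code-block:: python

    import torch

    class SinhBad(torch.autograd.Function):
        # Example of what NOT to do
        @staticmethod
        def forward(ctx, x):
            expx = torch.exp(x)
            expnegx = torch.exp(-x)
            ctx.save_for_backward(expx, expnegx)
            return (expx - expnegx) / 2

        @staticmethod
        def backward(ctx, grad_out):
            expx, expnegx = ctx.saved_tensors
            # expx and expnegx were computed in no-grad mode during forward, so
            # grad_input has no backward graph and double backward is incorrect
            grad_input = grad_out * (expx + expnegx) / 2
            return grad_input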
@@ -211,8 +206,8 @@ require grad.
         return grad_input


-Use torchviz to visualize the graph. Notice that `grad_x` is not
-part of the graph!
+Visualize the graph with torchviz.
+Notice that `grad_x` is not part of the graph!

 .. code-block:: python

@@ -225,15 +220,13 @@ part of the graph!



-When Backward is not Tracked
+When Backward Cannot be Tracked
 -------------------------------------------------------------------
-Finally, let's consider an example when it may not be possible for
-autograd to track gradients for a functions backward at all.
-We can imagine cube_backward to be a function that may require a
-non-PyTorch library like SciPy or NumPy, or written as a
-C++ extension. The workaround demonstrated here is to create another
-custom function CubeBackward where you also manually specify the
-backward of cube_backward!
+Finally, let's look at a situation where autograd cannot track gradients for a
+function's backward at all. Suppose cube_backward uses an external library such
+as SciPy or NumPy, or is implemented in C++.
+The workaround in such cases is to create another custom function, CubeBackward,
+and manually specify the backward of cube_backward as well!


 .. code-block:: python
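The Cube / CubeBackward pair referred to above is not shown in the diff either; a sketch of the workaround, where the backward of `cube_backward` is itself wrapped in a second custom Function (the helper names below are assumptions standing in for, e.g., NumPy, SciPy, or C++ implementations), might look like:

.. code-block:: python

    import torch

    # Stand-ins for routines that autograd could not track on its own
    def cube_forward(x):
        return x ** 3

    def cube_backward(grad_out, x):
        return grad_out * 3 * x ** 2

    def cube_backward_backward(grad_out, sav_grad_out, x):
        return grad_out * sav_grad_out * 6 * x

    def cube_backward_backward_grad_out(grad_out, x):
        return grad_out * 3 * x ** 2

    class Cube(torch.autograd.Function):
        @staticmethod
        def forward(ctx, x):
            ctx.save_for_backward(x)
            return cube_forward(x)

        @staticmethod
        def backward(ctx, grad_out):
            x, = ctx.saved_tensors
            # Delegate to a second custom Function so that the backward of
            # cube_backward is also specified manually
            return CubeBackward.apply(grad_out, x)

    class CubeBackward(torch.autograd.Function):
        @staticmethod
        def forward(ctx, grad_out, x):
            ctx.save_for_backward(x, grad_out)
            return cube_backward(grad_out, x)

        @staticmethod
        def backward(ctx, grad_out):
            x, sav_grad_out = ctx.saved_tensors
            dx = cube_backward_backward(grad_out, sav_grad_out, x)
            dgrad_out = cube_backward_backward_grad_out(grad_out, x)
            # Return gradients in the order of forward's inputs: (grad_out, x)
            return dgrad_out, dx

    x = torch.tensor(2., requires_grad=True, dtype=torch.double)
    torch.autograd.gradcheck(Cube.apply, x)
    torch.autograd.gradgradcheck(Cube.apply, x)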
@@ -280,7 +273,7 @@ backward of cube_backward!
     torch.autograd.gradgradcheck(Cube.apply, x)


-Use torchviz to visualize the graph:
+Visualize the graph with torchviz.

 .. code-block:: python

@@ -292,10 +285,9 @@ Use torchviz to visualize the graph:
    :width: 352


-To conclude, whether double backward works for your custom function
-simply depends on whether the backward pass can be tracked by autograd.
-With the first two examples we show situations where double backward
-works out of the box. With the third and fourth examples, we demonstrate
-techniques that enable a backward function to be tracked, when they
-otherwise would not be.
+To conclude, whether double backward works for your custom function depends on
+whether autograd can track the backward pass. The first two examples showed
+cases where double backward works out of the box,
+and the third and fourth examples demonstrated techniques that make an
+otherwise untracked backward function trackable.

