As usual, you will follow the Deep Learning methodology to build the model:
1. Initialize parameters / Define hyperparameters
2. Loop for num_iterations:
    a. Forward propagation
    b. Compute cost function
    c. Backward propagation
    d. Update parameters (using parameters and grads from backprop)
3. Use trained parameters to predict labels
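The methodology above can be sketched as a training loop. The sketch below uses a deliberately tiny toy model (a single weight fitting y = 2x with squared loss) so that each of the four loop steps is one line; the function name and toy model are illustrative, not the assignment's code.

```python
import numpy as np

def train(X, Y, learning_rate=0.1, num_iterations=100):
    w = 0.0                                 # 1. initialize parameters
    for _ in range(num_iterations):         # 2. loop for num_iterations
        Y_hat = w * X                       #    a. forward propagation
        cost = np.mean((Y_hat - Y) ** 2)    #    b. compute cost function
        dw = np.mean(2 * (Y_hat - Y) * X)   #    c. backward propagation
        w -= learning_rate * dw             #    d. update parameters
    return w                                # 3. trained parameters -> predictions

X = np.array([1.0, 2.0, 3.0])
Y = 2 * X
w = train(X, Y)   # w converges toward 2
```

The L-layer network replaces each one-line step with a module you implement below, but the loop skeleton is identical.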
Initialize the parameters for a two-layer network and for an $L$-layer neural network.
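A minimal sketch of the L-layer initialization, assuming `layer_dims` lists the layer sizes `[n_x, n_h1, ..., n_y]`; the 0.01 scaling and the function name follow the common convention and are assumptions, not necessarily the assignment's exact code.

```python
import numpy as np

def initialize_parameters_deep(layer_dims, seed=1):
    """Small random weights, zero biases, for layers 1..L."""
    rng = np.random.default_rng(seed)
    parameters = {}
    L = len(layer_dims)              # number of layers, counting the input
    for l in range(1, L):
        # W[l] has shape (n[l], n[l-1]) so that Z[l] = W[l] A[l-1] + b[l]
        parameters[f"W{l}"] = rng.standard_normal(
            (layer_dims[l], layer_dims[l - 1])) * 0.01
        parameters[f"b{l}"] = np.zeros((layer_dims[l], 1))
    return parameters
```

For the two-layer case, `layer_dims` simply has three entries, e.g. `[n_x, n_h, n_y]`.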
Implement the forward propagation module (shown in purple in the figure below).
Complete the LINEAR part of a layer’s forward propagation step (resulting in $Z^{[l]}$).
We give you the ACTIVATION function (relu/sigmoid).
Combine the previous two steps into a new [LINEAR->ACTIVATION] forward function.
Stack the [LINEAR->RELU] forward function L-1 times (for layers 1 through L-1) and add a [LINEAR->SIGMOID] at the end (for the final layer $L$). This gives you a new L_model_forward function.
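The whole forward module can be sketched as follows. It assumes `parameters` is a dict with keys `"W1", "b1", ..., "WL", "bL"` (as produced in the initialization step); the helper names mirror the assignment's, but this is an illustrative sketch, not the assignment's reference implementation.

```python
import numpy as np

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z))

def relu(Z):
    return np.maximum(0, Z)

def linear_activation_forward(A_prev, W, b, activation):
    Z = W @ A_prev + b                       # LINEAR part: Z[l] = W[l] A[l-1] + b[l]
    A = sigmoid(Z) if activation == "sigmoid" else relu(Z)
    return A, ((A_prev, W, b), Z)            # cache everything backprop will need

def L_model_forward(X, parameters):
    caches = []
    A = X
    L = len(parameters) // 2                 # each layer contributes a W and a b
    for l in range(1, L):                    # [LINEAR->RELU] repeated L-1 times
        A, cache = linear_activation_forward(
            A, parameters[f"W{l}"], parameters[f"b{l}"], "relu")
        caches.append(cache)
    AL, cache = linear_activation_forward(   # [LINEAR->SIGMOID] for layer L
        A, parameters[f"W{L}"], parameters[f"b{L}"], "sigmoid")
    caches.append(cache)
    return AL, caches
```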
Compute the loss.
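Since the final layer is a sigmoid, the natural loss is the cross-entropy cost, sketched below under the assumption that `AL` holds the predicted probabilities and `Y` the 0/1 labels, both of shape (1, m):

```python
import numpy as np

def compute_cost(AL, Y):
    """Cross-entropy cost averaged over the m examples."""
    m = Y.shape[1]
    cost = -np.sum(Y * np.log(AL) + (1 - Y) * np.log(1 - AL)) / m
    return float(np.squeeze(cost))
```

For example, predicting probability 0.5 on every example gives a cost of log 2 ≈ 0.693.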
Implement the backward propagation module (denoted in red in the figure below).
Complete the LINEAR part of a layer’s backward propagation step.
We give you the gradient of the ACTIVATION function (relu_backward/sigmoid_backward).
Combine the previous two steps into a new [LINEAR->ACTIVATION] backward function.
Stack [LINEAR->RELU] backward L-1 times and add [LINEAR->SIGMOID] backward at the end, in a new L_model_backward function.
/* A sequence of fractions: 2/1, 3/2, 5/3, 8/5, ...
   Compute the sum of the first n terms. */
#include <iostream>
#include <cstdio>
using namespace std;

int main() {
    int n = 0;
    double sum = 0;
    double a = 1.0;              // denominator of the current term
    double b = 2.0;              // numerator of the current term
    cin >> n;
    if (n < 1 || n > 40) return 0;
    for (int i = 0; i < n; i++) {
        sum += b / a;
        b = a + b;               // each term is a ratio of consecutive
        a = b - a;               // Fibonacci numbers, so advance both
    }
    printf("%.6f\n", sum);
    return 0;
}
/* Read from the keyboard one line consisting only of English letters and
   spaces (adjacent words are separated by one or more spaces), and:
   (1) print each word and its length
   (2) print the longest word
   Example: input "I am a boy" prints
   I 1
   am 2
   a 1
   boy 3
   The longest word is:boy */
#include <iostream>
#include <cstdio>
#include <cstring>
using namespace std;

int main() {
    char a[1000];
    char b[30];                     // holds the longest word seen so far
    int m = 0;                      // m: length of the longest word so far
    printf("Enter a line: ");
    cin.getline(a, sizeof(a));      // gets() is unsafe and was removed in C++11
    int len = strlen(a), i = 0;
    while (i < len) {
        while (i < len && a[i] == ' ') i++;   // skip separating spaces
        int start = i;
        while (i < len && a[i] != ' ') i++;   // scan to the end of one word
        int wlen = i - start;                 // wlen: length of this word
        if (wlen > 0) {
            printf("%.*s %d\n", wlen, a + start, wlen);
            if (wlen > m) {                   // remember the longest word
                m = wlen;
                strncpy(b, a + start, wlen);
                b[wlen] = '\0';
            }
        }
    }
    printf("The longest word is:%s\n", b);
    return 0;
}