There was no feature flag check when a new module was
created via the modal. This fixes it by checking the
feature flag and rendering MathML, if needed, after a successful
request.
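A minimal sketch of the idea (the callback name, the ENV lookup, and the
element id are illustrative assumptions, not the actual code; the
process-new-math event name comes from the related math-handling commits
below):

  // after the "create module" request succeeds
  function onCreateSuccess(newModule) {
    if (window.ENV?.FEATURES?.new_math_equation_handling) {
      const elem = document.getElementById(`context_module_${newModule.id}`)
      if (elem) {
        // hand the new element to the math handling code
        window.dispatchEvent(
          new CustomEvent('process-new-math', {detail: {target: elem}})
        )
      }
    }
  }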
fixes LS-1776
flag=none
Change-Id: I867e65abee7d78f15e66dc003ec28dd903982ef9
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/257906
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Ed Schiebel <eschiebel@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Eric Saupe <eric.saupe@instructure.com>
closes OUT-4122
flag=enable_webcam_submission
test plan:
- Enable enable_webcam_submission ff
- create an assignment with submission type "file uploads"
- as a student, open the assignment > submit assignment
- Assert the Send File or Use Webcam buttons are displayed
- Click on Send File
Assert the legacy file upload is displayed
Assert the input works (submit)
- Click on Use Webcam
Assert you can take a picture using the webcam
- Click Add another file
Assert you see Send File or Use Webcam
- Click Submit assignment
Assert you can download the webcam image and
the file in the legacy input
Change-Id: I71d12a5de7d62c7b2774f9588da9ab086de7888d
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/254862
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
QA-Review: Chrystal Langston <chrystal.langston@instructure.com>
Product-Review: Jody Sailor
Reviewed-by: Stephen Kacsmark <skacsmark@instructure.com>
Reviewed-by: Michael Brewer-Davis <mbd@instructure.com>
closes LS-1864
flag=new_math_equation_handling
with a touch of inline_math_everywhere
updates:
- get inline math in links underlined
- prevent MathJax from typesetting equations when editing a quiz
- prevent MathJax from loading on rubrics or files pages
test plan:
- go to /courses/:id/rubrics
- put inline math in a rubric
> expect it to never get typeset by mathjax
- create an assignment with inline math in the title
- go to /courses/:id/assignments
- hover over the assignment's title
> expect the text and the equation to be underlined
- go to /courses/:id/quizzes
- create a quiz, add a question
- put inline math in answers and an answer comment
- save the question and quiz
- preview the quiz and take it
> expect the math to be typeset
- edit the quiz, go to questions tab
- show question details
- using the equation editor, add or edit an equation in the question
- save the question and quiz
- preview/take the quiz
> expect the equations to be typeset and look correct
- edit the quiz and go to questions tab again
- show question details
> expect the inline equations to be the raw, not-typeset source, as expected
- go to a files page
- rename a file to \(x^2\).whatever
- refresh the page
> the filename should be "\(x^2\)" and not be typeset by MathJax
Change-Id: I62d3d7fb94c91d46de4e6e9689ba345b279c1f2f
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/257981
Reviewed-by: Jeff Largent <jeff.largent@instructure.com>
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Ed Schiebel <eschiebel@instructure.com>
Remove the fixed width applied to the RCE when the page loads, and
refactor the HTML table in the answers form so the RCE uses a row
instead of a column; this lets the RCE fit the container width
automatically.
fixes LS-1803
flag=none
Test Plan:
- In a course, enable the RCE enhancements feature.
- Create a Quiz and add a new question.
- In the Answers section, add a comment to an answer to display the
RCE.
- Expand or contract the window.
- The RCE for each answer should be responsive and adapt to the
width of the window
Change-Id: I8f014e4059cfc9ca33516da4aa189a039949a3cb
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/257595
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Product-Review: Jonathan Guardado <jonathan.guardado@instructure.com>
Reviewed-by: Ed Schiebel <eschiebel@instructure.com>
QA-Review: Ed Schiebel <eschiebel@instructure.com>
closes FOO-1501
refs FOO-130
flag = granular_permissions_course_files
[fsc-max-nodes=18]
Test Plan:
• see test plan outlined in base commit: g/253777
Change-Id: I33984062fd236348d39262395e5f51335e327ed9
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/256914
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Michael Ziwisky <mziwisky@instructure.com>
Reviewed-by: Charley Kline <ckline@instructure.com>
QA-Review: August Thornton <august@instructure.com>
Product-Review: August Thornton <august@instructure.com>
closes QUIZ-8085
flag=new_quizzes_modules_support
test plan:
- with the FF enabled:
- on the modules index page, open the
"add content to module" modal
- select "Quiz" from the dropdown and click [ Create Quiz ]
- if you have not opted for a particular engine yet
you should be presented with the choice between Classic
and New Quizzes (both should work)
- if you have opted for your quiz engine choice to be saved,
your choice should be used automatically, and you will
not have the option here
- with the FF disabled:
- creating a quiz by clicking [ Create Quiz ] should always
create a classic quiz, regardless of saved preference
Change-Id: Id53a95dd3e2f867167abbabefaf22e09b0bdfe89
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/257075
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Stephen Kacsmark <skacsmark@instructure.com>
QA-Review: Mark McDermott <mmcdermott@instructure.com>
Product-Review: Susan Sorensen <susan.sorensen@instructure.com>
eportfolios has a bug where it won't use the new rce
closes LS-1424
flag=rce_auto_save
the rce_enhancements flag is a prereq
test plan:
- turn on the rce_auto_save flag
- type some stuff in the rce
- click cancel or submit/save/done/etc
> you should not be prompted about autosaved content
- refresh the page w/o cancelling or saving
> you should be prompted to restore autosaved content
- both create and edit
course syllabus
assignment
submit text entry assignment as a student
re-submit text entry assignment
course announcement
course discussion
edit discussion
reply to discussion
edit reply to discussion
outcome
pages
quiz
- quiz questions and answers never autosave since we only
autosave when there is only 1 RCE on the page and
the quiz description on the details tab is the 1st RCE
- account-level announcements do not autosave, so you should never
be prompted
- eportfolios will not use the new RCE. there's a ticket for that
Change-Id: I0639bcffc81de54073173da5325200d02f374321
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/257005
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Jeremy Stanley <jeremy@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Ed Schiebel <eschiebel@instructure.com>
closes QUIZ-8089
flag=new_quizzes_modules_support
test plan:
- create a module as well as some Classic and New Quizzes
in a course
- on the modules page (and/or with the modules page set as the home page)
adding existing Classic and New Quizzes to a module should work
- test adding multiple quizzes at once
- creating a new Classic Quiz from the modal should work
- creating a classic quiz at the same time as adding other
quizzes should work
- editing a quiz in a module should work (kebab menu)
- indenting a quiz in a module should work (kebab menu)
- with the FF disabled, there should be no regressions
Change-Id: I6e6b0a30860a90b2344450cdc202a6b96ad86fd2
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/256471
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Stephen Kacsmark <skacsmark@instructure.com>
QA-Review: Mark McDermott <mmcdermott@instructure.com>
Product-Review: Susan Sorensen <susan.sorensen@instructure.com>
Upon clicking the Find button in the Outcomes tab, a modal should open.
The modal’s title should be “Add Outcomes to %{context}”, where
context is Account or Course. There should be a split view under
the border with a title on the left (‘Outcome Groups’) along with
an image and helper text on the right
closes OUT-4036
flag=improved_outcomes_management
Test plan:
- Go to Account > Settings > Feature Options
- Enable Improved Outcomes Management FF
- With Improved Outcomes Management FF Enabled
- Go to Account > Outcomes
- click on Find button; a fullscreen modal should open
- modal title should say "Add Outcomes to Account"
- click on X button of modal; modal should close
- click on Done button of modal; modal should close
- Go to Courses and select a course, then select Outcomes
- click on Find button; a fullscreen modal should open
- modal title should say "Add Outcomes to Course"
- click on X button of modal; modal should close
- click on Done button of modal; modal should close
- verify that modal visualization is similar under
Chrome, Firefox and Safari
Change-Id: Ibfad684e6dbf0a62396d03c5932c10377676f20f
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/256572
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Michael Brewer-Davis <mbd@instructure.com>
Reviewed-by: Pat Renner <prenner@instructure.com>
QA-Review: Pat Renner <prenner@instructure.com>
Product-Review: Jody Sailor
Closes INTEROP-6430
flag=none
Support for the deep linking request was added
for the collaborations placement in 930f6ed.
This change is the second half and allows LTI
1.3 tools to return content items at that
placement.
Test Plan:
1. Install an LTI 1.3 tool that handles deep
linking requests
2. Install an LTI 1.1 tool that supports
content item requests (the LTI 1.1
example tool on GitHub works here).
3. Navigate to the collaborations page
of a course.
4. Create a new collaboration using the LTI
1.3 tool. Verify the following during
creation:
- Specifying a "Message" in the deep
linking response shows that message
in a flash message
- Specifying an "Error Message" in the
deep linking response shows that
error message in a flash message
- Upon collaboration creation, Canvas
launches the tool to the URL given in
the deep linking response in a new
tab.
5. Refresh the collaborations page and click
on the new collaboration. Verify the tool
launches to the correct URL in a new tab.
6. Create a new collaboration using the LTI
1.x tool. Verify the same items listed
in step 4.
7. Verify collaborations can be deleted via
the "delete" icon.
Change-Id: I756b94410b1d8c527e698debd84138008feb8937
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/256594
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Wagner Goncalves <wagner.goncalves@instructure.com>
QA-Review: Wagner Goncalves <wagner.goncalves@instructure.com>
Product-Review: Karl Lloyd <karl@instructure.com>
This reverts commit 9af1badd2f.
Change-Id: I159a478d1f3baa062148e3fd5ff4ee914176579c
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/256982
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Simon Williams <simon@instructure.com>
QA-Review: Simon Williams <simon@instructure.com>
Product-Review: Simon Williams <simon@instructure.com>
Don't show this warning/error when returning module list.
It doesn't seem to hurt but is confusing.
Test plan:
- Add module item or items with deep linking
- Observe that it works and there is no error in the JS console
refs INTEROP-6446
flag=none
Change-Id: Ibd125d28195e31a29a2e5a2d491dbdb5a4720a22
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/256501
Reviewed-by: Xander Moffatt <xmoffatt@instructure.com>
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
QA-Review: Ryan Hawkins <ryan.hawkins@instructure.com>
Product-Review: Evan Battaglia <ebattaglia@instructure.com>
This reverts commit def240c3c5.
This feature flag is no longer needed. Also, there was actually a bug in
it which prevented adding multiple content items from working right:
the js ENV key was called PROCESS_MULTIPLE_CONTENT_ITEMS but we were
checking ENV.process_multiple_content_items_modules_index.
The module items were actually being added, but the dialog was not being
closed and the page was not refreshed as it should have been.
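An illustrative sketch of the mismatch (the helper is hypothetical; only
the two key names come from this commit):

  // before: the dialog read a key nothing ever set, so it was always falsy
  //   env.process_multiple_content_items_modules_index
  // after: read the key the page actually exposes
  function shouldProcessMultipleContentItems(env) {
    return Boolean(env.PROCESS_MULTIPLE_CONTENT_ITEMS)
  }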
Note that an error (warning?) "invalid messageType:
LtiDeepLinkingResponse" still appears in the console. This appears
to not affect any functionality but I will look at fixing it soon.
Also note that sometimes when processing the deep linking response, I
will apparently get logged out of Canvas. I'm not sure what that's all
about, but this commit shouldn't make that any worse.
Closes INTEROP-6446
flag=process_multiple_content_items_modules_index
Test plan:
- Have the LTI 1.3 test tool installed with the Resource Selection or
Link Selection placement
- From the module index page, add a module item and choose "External
Tool" and choose the LTI 1.3 test tool
- Select "Return multiple Content items" and click "Submit" in the tool
- Three new module items should be created, and the "Add Item to..."
dialog should be closed, and the page refreshed to show the new items.
Change-Id: I85391199493eb0f96b7861c7e9d2318553bc5c91
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/256472
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Xander Moffatt <xmoffatt@instructure.com>
QA-Review: Mysti Lilla <mysti@instructure.com>
Product-Review: Evan Battaglia <ebattaglia@instructure.com>
This supports soft-deleting users
refs FOO-921
test plan:
- Create a user and navigate to their profile page
- Click the Delete from ALL accounts button
- After a confirmation dialog, the user should be deleted
Change-Id: I3a1a4c07db68d3d188319c31daf278c669ba5c14
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/256529
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Simon Williams <simon@instructure.com>
QA-Review: Simon Williams <simon@instructure.com>
Product-Review: Simon Williams <simon@instructure.com>
fixes FOO-1409
flag = none
no more client_apps, canvas_quizzes now lives as part of canvas-lms
proper inside app/jsx/, which makes the build leaner and leaves us with
one less thing to reason about
logical changes:
- converted from AMD to ES modules
- upgraded to recent react + react-router
- dropped RSVP in favor of native Promises
- used CanvasModal instead of home-grown Dialog
- removed dead code; notifications in particular were fishy as they had
no dependents at all and did not even show up in the graph
- ported tests to Jest, added more unit ones and two integration ones
- removed "config.onError" and now throws errors where appropriate
- disabled console statements in non-dev
:: test plan ::
- create a (old-school) quiz containing all types of questions
- as 3 distinct students, take the quiz and try to randomize your
answers
at this point it's helpful to have a reference to compare the screens; I
replicated the quiz on my production sandbox for this
- go to /courses/:id/quizzes/:id/submissions/:id/log
- verify it looks OK
- click on a specific question in the stream and verify the question
inspector widget works OK
- go back to stream and push "View table"
- verify the table and its controls are OK
- go to /courses/:id/quizzes/:id/statistics
- verify it looks OK
- click on ? in the discrimination index chart and verify it displays
a dialog with help content
- click on "X respondents" in one of the charts and verify it displays
a dialog with the respondent names
- verify the interactive charts do interact as expected (no logic
changed here so just a quick glance)
- link to "View in SpeedGrader" for essay-like questions works
Change-Id: I79af5ff4f1479503b5e2528b613255dde5bc45d3
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/256118
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Simon Williams <simon@instructure.com>
QA-Review: Simon Williams <simon@instructure.com>
Product-Review: Simon Williams <simon@instructure.com>
when start/end dates are not completely specified in content imports,
canvas looks for the first and last event/due dates in the course to
help with the math. the problem is that it also uses these dates to
set the course start / conclude dates. if one of these dates is given
and the other is implied, the validation check in the migration code
is circumvented and the Course model validation fails instead, which
causes the migration to fail and some post-migration housekeeping
(such as caching due dates) does not run
I think it doesn't make sense to make the implicit start/end dates
explicit in the first place, so I prefer to take them out rather
than fix the validation.
test plan:
- have a course with no start/end dates and one assignment
with a due date
- create an empty course shell to copy into. enroll a
student in the course.
- perform a course copy, choose to shift dates, and leave
three dates blank, specifying only the new end date,
and give a date that is *earlier* than the assignment's
due date
- the migration should succeed
- speedgrader should be able to view the assignment in
the destination course
fixes LS-1670
Change-Id: Ic50004fb53f91cb2d048ab47bfbcafbb410cff59
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/256404
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Robin Kuss <rkuss@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Jeremy Stanley <jeremy@instructure.com>
closes LS-1733
flag=rce_pretty_html_editor
requires rce_enhancements to be on as well. I'm not sure the
new flag is necessary, and as of PS1, it doesn't fully hide the
new html editor functionality.
The INSTUI CodeEditor component uses CodeMirror v5, which sadly
is not accessible. See https://github.com/codemirror/codemirror.next
or https://codemirror.net/6/ for details on a future version.
For now, the RCE still provides access to the raw textarea if
KB access is needed by the user.
BONUS FEATURE! the html editors can now be viewed fullscreen
test plan:
- with the Pretty html editor feature flag off
- click the </> button
> expect the old boring html editor
- turn on Pretty HTML Editor feature flag
- click the </> button
> expect the deluxe new html editor
- click the "Raw HTML Editor" link
> expect the old boring editor
- click the "Pretty HTML Editor" link
> expect the pretty editor
- click the </>
> expect to be back in the rce
- shift-click the </>
> expect the old editor
- from anywhere, click the fullscreen button (except in safari
when in the old editor, safari won't fullscreen the textarea so
the button should be hidden)
> expect to be in fullscreen
- if you fullscreened the RCE, you can select "View > HTML Editor"
from the menubar
> expect to be in the html editor, fullscreened
- ESC
> expect to exit fullscreen
- edit your content anywhere
> expect the changes to be reflected everywhere else
- try it on a discussions or quizzes page with >1 RCE
> the editors and fullscreen should work as you expect
Change-Id: If5b17b2357a4ff5521f0cb9c42bd6a5a096f2436
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/255928
Reviewed-by: Jeff Largent <jeff.largent@instructure.com>
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
QA-Review: Jeff Largent <jeff.largent@instructure.com>
Product-Review: Peyton Craighill <pcraighill@instructure.com>
fixes FOO-1408
this allows us to control which files i18nliner processes completely
through config and without having to modify hardcoded paths in source
files, which is something we need for FOO-1265
.i18nrc files can include other directories through the "include"
directive:
// file: canvas-lms/.i18nrc
{ "include": [ "public/javascripts/.i18nrc" ] }
// file: public/javascripts/.i18nrc
{ "files": [...] }
:: test plan ::
aside from Jenkins exercising the i18n tasks, I ran a diff by hand over
the set of files that i18nliner processes before and after the patch,
with the new code processing a few more files: some handlebars in the
analytics plugin and the 3 client_apps/canvas_quizzes source files
if you really want to, you can do the same or find another way to verify
the output
on master, edit canvas_i18nliner/js/main.js somewhere before the
exports:
Check.prototype.checkWrapper = f => console.warn(f)
run it:
./gems/canvas_i18nliner/bin/i18nliner check 2> \
  tmp/i18nliner-upstream-files.txt
cat tmp/i18nliner-upstream-files.txt | sort > \
  tmp/i18nliner-upstream.txt
now do similar on our branch (although we need to massage the output
because the paths are absolute:)
./gems/canvas_i18nliner/bin/i18nliner check 2> \
  tmp/i18nliner-patched-files.txt
cat tmp/i18nliner-patched-files.txt | sort > \
  tmp/i18nliner-patched.txt
sed -i "s{$PWD/{{" tmp/i18nliner-patched.txt
now look for differences:
git diff --no-index \
tmp/i18nliner-upstream.txt \
tmp/i18nliner-patched.txt
Change-Id: Ic73cbc7261ab597deb567fc5d0af1e3014875da1
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/255952
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Simon Williams <simon@instructure.com>
QA-Review: Simon Williams <simon@instructure.com>
Product-Review: Simon Williams <simon@instructure.com>
For postMessage issue, see b3e640ac52 for
earlier, partial fix. Documentation (e.g.
https://github.com/bracken/lti_messaging#ltiscreenreaderalert) confirms
`body` should be a string.
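A hedged example of what a conforming message from a tool frame looks
like (per the linked docs the body is a plain string; the exact payload
a given tool sends will vary):

  // tool-side: body is a string, not an object, so Canvas can drop it
  // straight into the screenreader alert region
  window.parent.postMessage(
    JSON.stringify({subject: 'lti.screenReaderAlert', body: 'Submission saved'}),
    '*'
  )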
flag=none
closes INTEROP-6416
Test plan:
- For postMessage issue: see ticket for repro steps. Check that it can
be repro'd before this fix but not after.
- In repro steps, open dev tools on the Canvas page, open the Elements
tab, and observe that the "flash_screenreader_holder" div is being
updated every second with the JSON of the body.
- Modify the repro HTML file to send a string for body and observe that
it makes the "flash_screenreader_holder" div contain just the string
(with no extra quotes).
- For deep linking issue: modify the LTI 1.3 test tool
(app/controllers/deep_linking_controller.rb:20) so that it returns
hashes like this for "message" and "error_message":
{html: '<img src=x onerror=alert(123)'}
- Use the LTI tool to return data via deep linking (e.g. adding a
module item, embedding a link in the RCE)
- Before this change that should cause an alert; after this change the
HTML (escaped) should be shown in the flash message.
- Test the content-item code path with an LTI 1.1 tool. I modified
the lti_tool_provider_example code (form in content_item_form.js.jsx).
It seems like something is already turning a hash parameter into a
string though, so it seems like the changes to
external_tool_controller are not necessary, but they don't hurt.
Change-Id: I4a23b4c4173db0fec2ec745001da5d8c6d54997c
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/255758
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
QA-Review: Tucker Mcknight <tmcknight@instructure.com>
Reviewed-by: Weston Dransfield <wdransfield@instructure.com>
Reviewed-by: Mysti Lilla <mysti@instructure.com>
Product-Review: Weston Dransfield <wdransfield@instructure.com>
If a CSV file is imported and contains override score changes (which
will only be processed in the first place if the relevant feature flag
is on), show the changes on the upload confirmation screen and allow the
user to edit them as with assignments. Do not actually apply the
changes yet.
closes EVAL-1351
flag=import_override_scores_in_gradebook
Test plan:
- With the "Import Override Scores in Gradebook" feature ON:
- Upload a CSV with changes to override scores
- As with assignments, if any student's override grade has changed for
a given grading period, the upload confirmation screen should show
the old and new override scores for all students for that grading
period
- Ditto for override scores for the whole course
- Any changes that would decrease a student's override score, or
remove a score, should be highlighted in red
- You should be able to edit the scores inline and have the changes
stick
- Note that confirming/submitting will not actually submit the
override changes
- With the "Import Override Scores in Gradebook" feature OFF:
- Changes to override scores in imported CSVs should not be
acknowledged on the confirmation page
Change-Id: Ie27a88bda45ce39f6e879c92d8ab2b38685797ad
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/254944
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
QA-Review: Adrian Packel <apackel@instructure.com>
Product-Review: Jody Sailor
Reviewed-by: Gary Mei <gmei@instructure.com>
Reviewed-by: Spencer Olson <solson@instructure.com>
closes LS-1609
flag=rce_enhancements
Had to nohoist tinymce so it remains w/in canvas-rce
where RCEWrapper.js can require tinymce's stylesheets, but it
still needs to be a dev dep. of canvas-lms because there are
specs that import tinymce.
Moved the matchMedia jsdom polyfill into jest-setup where it can be
used by everyone. Need it there since the new tinymce calls it.
test plan:
- specs pass
- the RCE still works and looks the same
Sorry, I don't know what else to say
Change-Id: I8c956664176b7c25995a55e0c6fea4dafad3970f
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/255604
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Nate Armstrong <narmstrong@instructure.com>
QA-Review: Nate Armstrong <narmstrong@instructure.com>
Product-Review: Ed Schiebel <eschiebel@instructure.com>
I didn't take into account we might get handed an element that's not
attached to the body.
closes LS-1715
flag=none
test plan:
- load /courses/:id/pages
> expect no exception to be thrown from mathml.isMathJaxIgnored
in particular:
Uncaught TypeError: Cannot read property 'querySelector' of null
or
Uncaught TypeError: Cannot read property 'classList' of null
Change-Id: I079b57521574b5ce7a3ed11ced4a985d47762e04
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/255380
Reviewed-by: Jeff Largent <jeff.largent@instructure.com>
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Ed Schiebel <eschiebel@instructure.com>
We are adding the ability to save the custom params from an external
tool at assignment selection placement.
closes INTEROP-6311
flag=none
test-plan:
* Have an LTI tool installed in your local Canvas; you can use the
lti-1.3-test-tool for it;
* Have a Course created;
* When creating a new Assignment:
1. Choose `External Tool` from the Submission Type field;
2. Click the Find button to select `LTI Test Tool` in the
`Configure External Tool` modal;
3. After the LTI tool has launched, you will be able to fill the Custom
Params text area with a custom JSON and click on submit;
4. At this point you will be able to check that the hidden input was
filled with the custom JSON; for example, you can access the browser
console and execute `$('#external_tool_create_custom_params').val()`;
5. After selecting the tool and saving the assignment, you will be able
to check the Assignment in the database, for example:
Assignment.find(assignment).lti_resource_links.first.custom;
Change-Id: I5f1b1143eec035eb18f814439c6ae7077bcab8bf
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/254453
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Weston Dransfield <wdransfield@instructure.com>
QA-Review: Weston Dransfield <wdransfield@instructure.com>
Product-Review: Weston Dransfield <wdransfield@instructure.com>
closes OUT-4111
flag=account_level_mastery_scales
test-plan:
- Enable the FF, if not already
- Create an account rubric with at least one attached outcome
- Login as a teacher and view a course
- Create an assignment
- Create a mastery scale within this course that differs
from the account scale
- Attach the account rubric to the assignment
- Note that the account rubric should not have changed
- Edit the rubric within the assignment page (pencil icon)
- Ensure a prompt appears noting that the course mastery
scale will be used
- Press confirm and note that the learning outcome criteria are
now using the course mastery scale
- Ensure cancelling out of the edit screen leaves the rubric
unchanged and the criteria revert back to using the account scale
- Edit again and ensure saving accurately updates the criteria,
creates a new rubric, and does not modify the account rubric
- Repeat the steps above with the FF disabled and
ensure the existing behavior hasn't changed
(a new course rubric should be created when saving,
but it should have the same contents as the account rubric)
Change-Id: Ib49168aac13f3a370bea69d719de8783500b0e97
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/254290
Reviewed-by: Augusto Callejas <acallejas@instructure.com>
Reviewed-by: Michael Brewer-Davis <mbd@instructure.com>
QA-Review: Augusto Callejas <acallejas@instructure.com>
Product-Review: Jody Sailor
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
refs LS-1687
flag=none
test plan:
- with the new math handling flag off
- editing quizzes is copacetic
Change-Id: Iaafe092f73c723b1b741f2d0dfa2b8beb0d78090
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/255053
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Nate Armstrong <narmstrong@instructure.com>
QA-Review: Nate Armstrong <narmstrong@instructure.com>
Product-Review: Nate Armstrong <narmstrong@instructure.com>
Create a page to display when there are no outcomes associated
with the account/course. This page will render until outcomes
have been associated. For now, this page will always render
until the APIs are ready. Used the InstUI Billboard heading, which is
slightly different from the mocks.
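A rough sketch of the empty-state component described above, assuming
the @instructure/ui-billboard package; the component name, props, and
copy are illustrative only:

  import React from 'react'
  import {Billboard} from '@instructure/ui-billboard'

  // renders the "no outcomes yet" empty state for an account or course
  const NoOutcomesBillboard = ({contextType}) => (
    <Billboard
      size="large"
      heading="You have no outcomes yet"
      message={`Outcomes added to this ${contextType} will appear here.`}
    />
  )

  export default NoOutcomesBillboard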
closes OUT-4012
flag=improved_outcomes_management
test plan:
- Go to Account > Settings > Feature Options
- Enable Improved Outcomes Management FF
- With Improved Outcomes Management FF Enabled
- Go to Account > Outcomes
- You should see the no outcomes billboard that displays the
correct context in the message
- Go to Course > Outcomes
- You should see the no outcomes billboard that displays the
correct context in the message
- Disable Improved Outcomes Management FF
- With Improved Outcomes Management FF disabled
- Go to Account > Outcomes
- You should see previous Outcomes manager
- Go to Course > Outcomes
- You should see previous Outcomes manager
Change-Id: Iacacb200efec4ca5ea485ad87512e3b61f5a3f07
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/253748
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Product-Review: Jody Sailor
Reviewed-by: Pat Renner <prenner@instructure.com>
Reviewed-by: Michael Brewer-Davis <mbd@instructure.com>
QA-Review: Pablo Gomez <pablo.gomez@instructure.com>
closes LS-1678
flag=none
test plan:
- find a quiz with bad math equations in the answers
- request /courses/:id/quizzes/:id/edit?fixup_quiz_math_questions=1
> expect the quiz answers to look better
Change-Id: I2fc0d0beaadab3343df4b10a5f7413bd1b907e7d
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/254429
Reviewed-by: Rob Orton <rob@instructure.com>
QA-Review: Rob Orton <rob@instructure.com>
Product-Review: Rob Orton <rob@instructure.com>
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
closes LS-1640
flag=new_math_equation_handling
Currently, the mutation observer watches the body, and if any of
the mutationList items' addedNodes list is an element and is not
in the ignore list, then we ask MathJax to reprocess the whole body.
The callback into mathml is debounced to keep from doing this too
often when chunks of the page are updating.
With this change, we walk the addedNodes list and fire the
process-new-math event for each one; mathml then looks
to see if there's actually any math in the element, and if it's
not ignored, has MathJax process it. This speeds up processing
because MathJax doesn't have to look at the whole doc and potentially
reprocess a bunch of math it's already processed and hasn't changed,
and it's not having to wait for the debounce timeout.
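A minimal sketch of the per-node approach (the process-new-math event
name comes from this commit; the detail shape and the mathml-side
handler are assumptions):

  // fire process-new-math for each added element instead of asking
  // MathJax to retypeset the whole body
  const observer = new MutationObserver(mutationList => {
    for (const mutation of mutationList) {
      for (const node of mutation.addedNodes) {
        if (node.nodeType !== Node.ELEMENT_NODE) continue
        window.dispatchEvent(
          new CustomEvent('process-new-math', {detail: {target: node}})
        )
      }
    }
  })
  observer.observe(document.body, {childList: true, subtree: true})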
test plan:
- open pages with math
> expect it to be typeset by mathjax
> expect to see the typeset math faster than before
(good luck with that)
Change-Id: I9d3f8a0ab1efb6ed88c34d1bcf6caeec215ba140
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/253512
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Charley Kline <ckline@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Ed Schiebel <eschiebel@instructure.com>
closes LS-1639
flag=none
The change to SyllabusBehaviors puts code back that existed
before the new math handling was introduced, and should have
been behind its flag.
If the new_math_equation_handling flag is on, turn it off if
we're editing a quiz, and skip having the backend inject the
hidden mathml, which is part of the legacy equation handling.
test plan:
=== With the new math_equation_handling flag off ====
- create a quiz, add a multiple choice question with an equation
as an answer
- save the question, save the quiz
- edit the quiz, edit the question, do not edit the answer
> look at the DOM. there should be no more than 1
<span class="hidden-readable">
just after the equation image in the answer
- save the question, save the quiz
- preview the quiz
> expect just 1 <span class="hidden-readable"> in the DOM just
after the equation image
- no combination of edit, save, edit, ... should cause > 1
<span class="hidden-readable">
after the equation image
=== with the new_math_equation_handling flag on ===
- preview and edit the quiz you created with the flag off
> expect it to look A-OK
- create a new quiz
- use the rce's equation editor to put an equation everywhere
you can possibly think of in a quiz
- the text
- answers
- comments on the answers
> expect the equations to look right no matter what
- edit the quiz and all the places where there are equations
> yep still ok
- save the question
> still ok
- save the quiz
> still ok
- preview the quiz, to completion to see answer comments
> looks good _and_ equations are mathjaxified
- edit everything again
> still looks good everywhere
Change-Id: I1319d007509f6e8cbc9c9af81e3939e365b0fa92
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/253507
Reviewed-by: Jackson Howe <jackson.howe@instructure.com>
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Ed Schiebel <eschiebel@instructure.com>
Hide the student view button when editing syllabus and bulk editing
assignment due dates. In these locations, the same controller#action
is being used for view and edit, so the button is not removed by
the ApplicationController when switching to edit mode. Thus, we use
javascript to hide/show the button when switching.
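Roughly the idea, as an illustration only (the ids, class names, and
events below are assumptions, not the selectors this commit uses):

  import $ from 'jquery'

  // hide the student view button while editing, show it again on cancel/save
  function toggleStudentViewButton(visible) {
    $('#easy_student_view').toggle(visible)
  }

  $('.edit_syllabus_link').on('click', () => toggleStudentViewButton(false))
  $('.cancel_button, .save_button').on('click', () => toggleStudentViewButton(true))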
closes LS-1646
flag = easy_student_view
Test plan:
- Go to syllabus with flag on
- Expect to see student view button
- Click edit
- Expect student view button to be hidden now
- Click cancel or save
- Button should again be visible
- Go to assignments index with flag on (and bulk assignment edit
flag on)
- Expect to see student view button
- Click bulk edit
- Expect button to be hidden
- Click cancel or save
- Button should again be visible
- Functionality of syllabus edit and assignment bulk edit should
still work with easy_student_view flag off
Change-Id: I2e9319b339a1fad2a817c47b20e80c1ed17051d1
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/253753
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Ed Schiebel <eschiebel@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Peyton Craighill <pcraighill@instructure.com>
closes LS-1590
flag=none
Ideally we'd strip the alt text in the canvas backend, but that turns
out to be very inconvenient. Instead, add a hidden=1 param to the URL
and strip the alt text in the browser. This leaks the alt text to the
crafty user, but is better than nothing.
If this isn't good enough, I can work through the issues for getting
it handled in the backend.
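An illustrative browser-side sketch; the hidden=1 param name comes from
this commit, while the selector, helper name, and replacement text are
assumptions:

  // blank out leaked alt text when the lock image URL carries hidden=1
  function scrubHiddenImageAltText(root = document) {
    root.querySelectorAll('img[alt]').forEach(img => {
      const url = new URL(img.src, window.location.origin)
      if (url.searchParams.get('hidden') === '1') {
        img.alt = 'Image currently unavailable'
      }
    })
  }

  scrubHiddenImageAltText()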
test plan:
- as a teacher, unpublish an image file that's embedded in
RCE content.
- as a student, view the page
> expect the lock image in place of the real deal
> expect the alt text to say the image is unavailable
- as a teacher, schedule the file's student availability so it is not
currently available
> as a student, view the page
> expect the lock image
> expect the alt text to say the image is unavailable
Change-Id: Id4c7009bb21cc18227d50ce514159fbf73ff2d15
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/253568
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Jeremy Stanley <jeremy@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Ed Schiebel <eschiebel@instructure.com>
closes LS-1628
flag=new_math_equation_handling
This change fixes the issue found in LS-1627:
The previous code assumed image equations have the data-equation-content
attribute. There are legacy images where that's not true. We now
look at the image's src if it's missing, which is what defines the image
anyway. (I'd look solely at the src, but that makes testing difficult)
While working on 1627, I discovered and fixed a couple bugs around
slowly loading images. First, the `else` was in the wrong place
if the image was not currently loaded, so it would never attach
the onload handler. Also, img.complete is not adequate to determine
if an image is loaded. This is a bit tricky since complete can be set
if the image isn't really loaded and you can't simply add an onload
handler to all images since it doesn't fire if the image has
already loaded. What does work is to check img.naturalWidth since the
browser can't know that until it's really processed the image data.
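A hedged sketch of the load check described above (the helper names are
illustrative):

  // img.complete alone can be true before the data is processed,
  // so also require a nonzero naturalWidth
  function imageIsLoaded(img) {
    return img.complete && img.naturalWidth > 0
  }

  // run the callback now if the image is ready, otherwise when it loads
  function whenImageLoaded(img, callback) {
    if (imageIsLoaded(img)) {
      callback(img)
    } else {
      img.addEventListener('load', () => callback(img), {once: true})
    }
  }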
test plan:
- in the rce, create 3 images with the equation LaTeX in the
data-equation-content, title, or encodeURIComponent twice
in the src, like this:
<p>data-equation-content:</p>
<p>
<img class="equation_image"
title="y = mx + b"
src="http://localhost:3000/equation_images/y%2520%253D%2520mx%2520%252B%2520b"
alt="LaTeX: y = mx + b"
data-equation-content="y = mx + b"
/>
</p>
<p>title:</p>
<p>
<img class="equation_image"
title="y = mx + b"
src="http://localhost:3000/equation_images/y%2520%253D%2520mx%2520%252B%2520b"
alt="LaTeX: y = mx + b"
/>
</p>
<p>src:</p>
<p>
<img class="equation_image"
src="http://localhost:3000/equation_images/y%2520%253D%2520mx%2520%252B%2520b"
alt="LaTeX: y = mx + b"
/>
</p>
<p>slow:</p>
<p>
<img class="equation_image"
src="https://deelay.me/2000/https://latex.codecogs.com/gif.latex?y%20%3D%20x%5E2"
alt="LaTeX: y = x^2"
data-equation-content="y = x^2"
/>
</p>
- save
- with the new_math_equation_handling (Updated math equation handling)
flag off
- refresh the page
> expect the equation images
- with the flag on
- clear your cache (or the delayed image won't be delayed) and refresh the page
> expect all 4 equations to be replaced with the MathJax version
Change-Id: Ic46ae98447fa5c0eff2d1758ecde7f2fbdaeabe2
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/253181
Reviewed-by: Jeremy Stanley <jeremy@instructure.com>
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Ed Schiebel <eschiebel@instructure.com>
because if they're created in course context, the
autocreate_attachment_from_media_object feature will put them
in course files, where they can be visible to people who shouldn't
see them (especially after a course copy)
test plan:
- enable the autocreate_attachment_from_media_object feature
- in speedgrader, comment on a student's submission via video
- the student and other teachers that can see the submission
should be able to see the video
- the video should _not_ be attached to course files
- repeat these tests on the submission details page
(/courses/X/assignments/Y/submissions/Z)
closes LS-1589
Change-Id: I80f08a71fce9fe6c29a6f36546a1011d72972588
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/251694
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Jackson Howe <jackson.howe@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Jeremy Stanley <jeremy@instructure.com>
closes LS-1613
flag=none
When mathml.js is processing math equation images, only process
those that have loaded. If we don't, the DOM updates out from under
MathJax and the typeset equation can be lost.
Images that are not "complete" get an onload handler that dispatches
the "process-new-math" event to trigger a new cycle of processing.
test plan:
- in a discussion, add a reply
- use the equation editor to add an equation
- in the RCE, also add an inline LaTeX equation
- clear your cache and/or disable cache from devtools network tab
- save
> expect both equations to be typeset by MathJax
- edit the reply, clear cache, and click done
> expect both equations again.
Change-Id: I85db3a5b92926c21da9df1a2647596352ef5a73d
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/252593
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Jeremy Stanley <jeremy@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Ed Schiebel <eschiebel@instructure.com>
refs INTEROP-6272
flag=none
This code change does not fix the bug, but
it will make any future issues easier for
users to detect (and troubleshoot)
Test Plan:
- Install an LTI tool that does deep linking,
but does not return resource links (like the
YouTube LTI 1.1 tool)
- Attempt to embed a video as a module item
- Verify a clear error message is presented
Change-Id: I94f292fd2bf50aa4bdc451e000f78b0191943f71
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/251891
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
QA-Review: Xander Moffatt <xmoffatt@instructure.com>
Product-Review: Weston Dransfield <wdransfield@instructure.com>
Reviewed-by: Wagner Goncalves <wagner.goncalves@instructure.com>
Reviewed-by: Xander Moffatt <xmoffatt@instructure.com>
closes LS-1601
flag=new_math_equation_handling
The previous approach was to replace the equation image with the
equation's LaTeX in canvas' backend, but not all user content sent
to the browser passes through UserContent.escape; Discussions
and legacy quiz questions are among those that don't. The backend approach
also suffered from an ugly visual where the LaTeX is displayed onscreen until
MathJax typesets it.
In a previous commit, I caught Discussion replies in apiUserContent,
where the screenreader assistive mathml is injected into the DOM
adjacent to the image. That worked, but we now had 2 places
where the replacement was taking place, and quiz questions were
still being missed.
A better approach is to handle it all in a central location, which
is the code that detects math on the page. The new approach
is to inject the LaTeX into the DOM adjacent to the image just before
MathJax does its processing, then remove the image when it finishes.
This way the equation image is displayed to the user while MathJax
does its work, and since we look for new math in a MutationObserver
watching the whole document, we never miss any equation images on the page.
Because we are looking for mutations anywhere on the page, there may
be nodes we want to ignore (e.g. the quiz timer). This is handled
by adding to the ignore_list css selector in main.js
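A hedged sketch of the swap described above; the attribute fallbacks
follow these commits, but the function name and the MathJax v2 queue
call are illustrative, not the actual implementation:

  // place the LaTeX next to the equation image, typeset it, then drop the image
  function injectLatexForImage(img) {
    const latex =
      img.getAttribute('data-equation-content') || img.getAttribute('title')
    if (!latex) return
    const span = document.createElement('span')
    span.textContent = `\\(${latex}\\)`
    img.insertAdjacentElement('afterend', span)
    window.MathJax.Hub.Queue(
      ['Typeset', window.MathJax.Hub, span],
      () => img.remove()
    )
  }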
test plan:
- with the "Updated math equation handling" flag on
(and optionally 'Support LaTex math equations almost everywhere")
- double check that equations created with the rce equation editor
are processed with mathjax all over canvas
> expect equation images to be visible until replaced by MathJax
typeset versions
- Discussions:
- reply to a discussion with an equation (inline and equation editor)
> expect them to be typeset by mathjax
- edit a reply and save
> expect the reply to have its math processed by mathjax
- Legacy Quizzes
- create a quiz, set it so 1 question per page
- add a couple questions with equations
- preview the quiz, moving forward and back thru the questions
> expect the questions to have their equations typeset by mathjax
Change-Id: I9e2ec4fd53de06748156bbd4adadac7e2b1e205f
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/252222
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Jackson Howe <jackson.howe@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Peyton Craighill <pcraighill@instructure.com>
test plan:
- as a student, submit a file to an assignment
- as a teacher, view the submission in speedgrader, and note
that there is a download link (down arrow) to the right of
the file, but there is not a trash icon
- as an account administrator, view the submission. you
should see a trash icon.
- click it, and you'll get a confirmation prompt.
- confirm, and the page reloads with the submission
replaced by "file_removed.pdf"
closes LS-1583
Change-Id: I571cb3666d48936840df14d6bbfd8587ea52a873
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/251411
QA-Review: Robin Kuss <rkuss@instructure.com>
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Jackson Howe <jackson.howe@instructure.com>
Product-Review: Peyton Craighill <pcraighill@instructure.com>
If the LaTeX the user enters in the equation editor has errors, MathJax
will typeset it, with the bad parts in red. Unfortunately the service
canvas uses to generate the equation image simply fails. The equation
editor now verifies the image can be created and, if not, will insert the
raw LaTeX into the RCE delimited by \(...\). Once the content is saved,
MathJax will typeset it (including the red, but hey, that's better than
losing the equation because the image was missing).
now also looks for $$ or \( starting delimiters mathjax uses to identify
equations
and looks for math on wiki pages, which was missing before (that is not
behind the flag)
had to skip some screenreader_gradebook specs that started failing with
this change. The gradebook team is as baffled as I am. Will create a
ticket for them to investigate.
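A rough sketch of the verification described above, assuming the
/equation_images/:latex endpoint shown elsewhere in these commits and a
TinyMCE-style editor.insertContent; this is not the actual editor code:

  // try to render the equation image; fall back to raw delimited LaTeX
  async function insertEquation(editor, latex) {
    const url = `/equation_images/${encodeURIComponent(encodeURIComponent(latex))}`
    const response = await fetch(url, {method: 'HEAD'})
    if (response.ok) {
      editor.insertContent(
        `<img class="equation_image" src="${url}" alt="LaTeX: ${latex}">`
      )
    } else {
      editor.insertContent(`\\(${latex}\\)`)
    }
  }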
closes LS-1488
flag=inline_math_everywhere
test plan
- turn on SiteAdmin feature "Updated math equation handling"
(flag=new_math_equation_handling) and the RootAccount feature
"Support LaTex math equations almost everywhere"
(flag=inline_math_everywhere)
- in the RCE, enter \(your equation here\)
- enter $$your equation here$$
- save
> expect the first equation to be typeset and inline
> expect the second equation to be typeset and a block
- the equations can span over multiple lines too
- edit an eq with the eq editor
> expect it to be typeset too
- in the RCE, click on the equation toolbar button
- in the advanced pane of the equation editor, enter \var = 27
> expect the \var part to be red in the preview area (because it's undefined)
- click Insert Equation
> expect the raw LaTex to be in the RCE, surrounded by \(...\)
- save
> expect the equation to get typeset (and \var will still be red)
- have an assignment with a student submission
- grade the submission and include an equation in the comments
for example: \(y = x^2\)
- view the assignment as the student
> expect the equation to be rendered in the comment
- put some math in an assignment's title
> expect to see it in the assignments page
- create a page with inline math in it
> expect the math to be typeset
- create a discussion with math
- reply to the discussion with inline math in it
- edit the inline math in the reply
> expect the math to be typeset each time
- have a student submit an assignment
- as the teacher go to speedgrader
> expect typeset math everywhere
- add a comment with math
> expect it to get typeset
- goto /assignments/:id/submissions/:user_id
> expect comments with math to be typeset
- add a comment with math
> expect it to get typeset
- old quizzes mostly work. While editing a question, inline
math is not typeset, but will be when previewed or being
taken
- new module items can have math in their title, but if you edit
the title, it's not typeset (this has baffled me, but it's
got to be an extremely rare case)
- load the calendar and flip between Week, Month, Agenda views
> expect entries with inline math in their title to be typeset
- load the calendar in different initial views
> expect events with math to be typeset
- create a new event with math (e.g. "\(x^2\)" in the title)
> expect it to be typeset when saved
- open an event in the calendar that has math in its body
> expect it to be typeset
Change-Id: Id5e1e822fad29a52bf21573e62976a4482afcf43
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/248246
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Charley Kline <ckline@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Peyton Craighill <pcraighill@instructure.com>
When the LTI tool response for the `content_items` attribute is empty,
we need to close all dialogs/modals to provide a better experience for
the user.
Requesting partner: McGraw Hill
closes INTEROP-6263
flag=none
test-plan:
* Have an LTI tool installed in your local Canvas.
* At the Module level, click + to add a new item, then select
`External tool` from the dropdown menu, then select the LTI test tool.
* At the dialog to link the resource from an external tool, check the
option `Without Content Items` and then click the submit button.
* Ensure that Canvas closes all dialogs/modals and that no content item
is created; you can refresh the page to validate it too.
Change-Id: Id70813f5d214652b38dee209303345f801be542a
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/250866
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Xander Moffatt <xmoffatt@instructure.com>
QA-Review: Xander Moffatt <xmoffatt@instructure.com>
Product-Review: Karl Lloyd <karl@instructure.com>
refs QO-679
flag=none
test plan:
- if the timer would have displayed "NaN Minutes, NaN Seconds"
now it should display:
Your browser connectivity may be slow or unstable. In spite of your
browser’s timer being disconnected, your answers will be recorded
for an additional 5 minutes beyond the original time limit on this
attempt.
- the message should be in red (meeting 4.5:1 contrast)
- when the message is displayed, the timer label "Time Running"
should be hidden, as should the "hide" button
- when there are no issues with the timer, it should display
the time as before
Change-Id: I6144da5b6c7f23cee44772caca636be630f90a25
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/250987
Reviewed-by: Alex Slaughter <aslaughter@instructure.com>
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
QA-Review: Mark McDermott <mmcdermott@instructure.com>
Product-Review: Matthew Goodwin <mattg@instructure.com>
Test Plan:
- Create an old quiz with a time limit of 2 minutes
- Disable javascript and wait 5 minutes
- Re-enable javascript and the page should reload
Refs: QO-679
flag = none
Change-Id: I002a5d67e8e69cd46ef8269fed11c5898f04288c
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/250728
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Stephen Kacsmark <skacsmark@instructure.com>
QA-Review: Mark McDermott <mmcdermott@instructure.com>
Product-Review: Matthew Goodwin <mattg@instructure.com>
refs QO-679
flag=none
test plan:
- if the timer would have displayed "NaN Minutes, NaN Seconds"
now it should display a helpful message instead
- the message should be in red (meeting 4.5:1 contrast)
- when the message is displayed, the timer label "Time Running"
should be hidden, as should the "hide" button
- when there are no issues with the timer, it should display
as before
Change-Id: I33938b0b5208d9d37aab57379459f076086983d3
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/250884
Reviewed-by: Stephen Kacsmark <skacsmark@instructure.com>
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
QA-Review: Mark McDermott <mmcdermott@instructure.com>
Product-Review: Matthew Goodwin <mattg@instructure.com>
closes: LS-1265
flag=new_file_url_rewriting
- gets the verifier, if present, forwarded to the view ping-back url so
it doesn't fail for one student viewing another student's file.
Before, when clicking on a pdf and viewing it in canvadoc, the viewing
of the file succeeded, but the ping-back to log access failed with a
401. The ping-back now succeeds.
- gets the verifier added to the iframe src when the file is being
viewed locally. You'll see this when stud1 views a .txt file from
stud2. plain text files aren't canvadoc viewable, so they're viewed
in a vanilla iframe.
- stops rewriting non /preview file URLs as download URLs.
?download_frd=1 is reserved for that. Before, when users deleted
"/download" from a link using the html editor in the RCE, canvas put
it back. Not any more. They shouldn't need to any more either,
since...
- updates the RCE so links to files do not include /download in the
URL. When clicking on a file in the tray, the generated link no
longer includes /download (and canvas won't put it back)
Embedded images use /preview. Using Image Options to convert the
image to a link removes /preview, and no longer replaces it with
/download.
There's some weird file URL handling in canvas. If the URL is
/preview, it is not logged as a page view. There are comments
indicating that /download will log access, though not always actually
download the file, and that download_frd is used for that. I found
this hard to confirm; /download seemed to download for me a lot.
It might be in conjunction with wrap=1, or the inline class name?
This change sets the "new_file_url_rewriting" flag, which enables
these changes, to on in ci and dev.
This change scares me a little and I really want to know that it's
OK.
test plan:
- ensure new_file_url_rewriting is enabled (which it should be
unless you're in test or prod environments)
- it's nice to have canvadocs enabled too
- do everything you can think of that revolves around files/attachments
and make sure it still works. No, I'm not kidding
- link files in the RCE
- try file types that will be viewed in canvadocs (.pdf),
in google docs (.rtf), and in a vanilla iframe (.txt)
> expect clicking on the link to open the file in another tab
- embed images in the RCE
> expect the image to be shown, and the <img src> to be /preview
(when loading a page with an image, the image will not show up in
recent history, if that's on)
- using Image Options, convert an image to a link
> expect the image to be displayed in a new tab when the link is
clicked
- link a file, then add "download_frd=1" to the href's query_string
> click on the link and expect the file to be downloaded, not viewed
- I think there's some special handling WRT student submissions, so
try all ^that in a submission.
- All ^that should work if student1 links/embeds user files and
images, then
> expect student2 to be able to view them all
(discussions are good for this)
> expect existing content behavior to be the same as it ever was.
Change-Id: Ieae7e4daf549ececb982007b6ce97c8c091c099c
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/249094
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Jackson Howe <jackson.howe@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Ed Schiebel <eschiebel@instructure.com>
closes LS-1559
flag=none
This makes 2 changes
1. Before, we were loading both Lato and Lato Extended fonts. This is
unnecessary since Lato Extended is a superset of Lato. Now the default
behavior is to load Lato Extended and not Lato.
2. Because we host the Lato Extended font files, Cisco loads them from
their own instance of Akamai, and they want to avoid the extra bandwidth
it's using. If Setting.get('disable_lato_extended', 'false') returns
anything other than 'false', canvas will load Lato instead.
I don't know why it started failing, but I had to tweak an rce
selenium test.
test plan:
- load any canvas page
> using the devtools Network panel, Font filter, expect to see
our lato extended woff2 files downloaded from /fonts/lato/extended
and not from fonts.gstatic.com/s/lato
> expect the pages to look just like they did before
- load a page with the RCE
> expect the lato extended woff2 files loaded in the tinymce iframe
also.
- add some text including bold, italic and so forth
> expect the text to look like it did before (see also LS-317)
- in a rails console, run Setting.set('disable_lato_extended', 'true')
- send a SIGHUP to the canvas server process
(hint: `ps -a | grep puma` will find the process for `kill -1 <pid>`)
- load any canvas page
> expect to see fonts from fonts.gstatic.com/s/lato, and not our
files from /fonts/lato/extended
- load a page with the RCE
> expect to see the fonts loaded in the tinymce iframe
from fonts.gstatic.com, and not ours from /fonts/lato/extended
Change-Id: Ia0c1820fa96e5ae9095c3cd1c796d381fd035a8a
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/250533
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Jackson Howe <jackson.howe@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Ed Schiebel <eschiebel@instructure.com>
closes OUT-3933
flag=usage_rights_discussion_topics
If a course has usage rights enabled, then discussion topics
and announcements will require usage rights set when a file
is attached. If a user does not have manage_files permission,
then the usage rights indicator won't appear, but a default
set of usage rights will be applied.
prerequisites:
- in a course, enable copyright and license information
on files under course settings
- create teacher and student accounts
- create a group in the course, and add the student
test plan (before enabling feature flag):
- confirm that when creating course discussions
with file attachments as a teacher and student,
copyright information is not set in the Files
section
- confirm the same with group discussions
test plan:
- enable the feature flag
- for course discussions:
- as a teacher, confirm that when creating
a discussion, usage rights are required
when attaching a file. confirm the usage
rights settings are set by re-editing
the discussion and viewing the file in
the Files section
- as a student, confirm that when creating
a discussion, usage rights are not required
when attaching a file, but when the topic
is created, the file appears with a copyright
setting in the Files section
- for group discussions:
- as a teacher and student, confirm that when
creating a discussion, usage rights are
required when attaching a file. confirm the usage
rights settings are set by re-editing
the discussion and viewing the file in
the Files section
Change-Id: I0dc6532f7d8188cf4f623275fcf8562f19585f1f
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/248211
Reviewed-by: Pat Renner <prenner@instructure.com>
QA-Review: Pat Renner <prenner@instructure.com>
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Product-Review: Jody Sailor
This removes the 'Embed' tab from the record/upload media modal and
places it in its own modal that can be launched from a separate
button in the menubar and toolbar.
fixes LS-1387
flag=rce_enhancements
Test plan:
- Open the new RCE
- Click record/upload media button
- Expect modal to have 2 tabs - computer and record
- Expect functionality to still work
- Click Embed button and paste in an embed code
- Expect embedding to work
- Expect submit button to work only if text is present
Change-Id: I06f1fe81016f438c6dbf611aacf1250bfa214c7b
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/248951
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Ed Schiebel <eschiebel@instructure.com>
QA-Review: Ed Schiebel <eschiebel@instructure.com>
Product-Review: Peyton Craighill <pcraighill@instructure.com>
Previously, direct share options appeared for users with
manage_content rights in a course. This commit gives direct share
options to users with read_as_admin rights, so any teacher, TA, or
designer can direct share content, even if a course is concluded.
flag=none
Fixes LS-1409
Test plan:
- Enable direct share on account
- Create a term where "teachers can access from" is concluded
- Add course to term
- In the course as a teacher, verify that direct share functions from
the following locations (also expect most other menu items to not
appear):
- Modules
- Pages index and individual
- Discussions index and individual
- Assignments index and individual
- Quizzes index and individual
- Modify term settings to allow access for teachers always
- Teachers should now see all menu items in above locations, including
direct share
- Students should never see any menu options in above locations
- Disable direct share for account; verify that menus in above
locations still function
Change-Id: I53b09ed0c535079ab4e811d58de18ab1ef7f6d3a
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/248214
Reviewed-by: Jeremy Stanley <jeremy@instructure.com>
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Jackson Howe <jackson.howe@instructure.com>
closes: LS-1493
flag=rce_enhancements
test plan:
- in the RCE, click View > Fullscreen from the menus
> expect the RCE to fill the browser window
> expect View > Raw HTML Editor to be disabled
- click View > Fullscreen again (or type ESC, or type cmd-shift-F)
> expect the RCE to return
- click View > Raw HTML editor
> expect the rce to flip to html edit mode
- click the </> button in the RCE's status bar
> expect the rce to flip back to wysiwyg mode
- click the fullscreen button in the RCE's status bar
> expect the rce to go fullscreen
- from the rce content area, TAB into the status bar
- use the arrow keys to move focus thru the buttons
> expect it to wrap around when you get to either end
- with focus on any of the buttons, type TAB to leave the RCE
- type shift TAB to return
> expect focus to return to the same button that had focus before
- click the html view button
- type TAB to move focus to the status bar
> expect focus to return to the </> button
- use the arrow keys to move between the remaining 2 buttons
> expect focus to wrap around
- TAB out and back in
> expect focus to return to the button that had focus before
Change-Id: Ib327c0e19b56ed6461f0ce74d01cccd1c8c1f340
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/248171
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Jeremy Stanley <jeremy@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Peyton Craighill <pcraighill@instructure.com>
closes LS-1492
flag=none
test plan:
- turn the use_updated_math_rendering flag on
- change your user's language
- open an item with a math equation
- right-click in the equation
> expect the menu to be in that language
Change-Id: I786dfcf232827c2600da7be8a5ec84c19a6901a6
Reviewed-on: https://gerrit.instructure.com/c/canvas-lms/+/248175
Tested-by: Service Cloud Jenkins <svc.cloudjenkins@instructure.com>
Reviewed-by: Jackson Howe <jackson.howe@instructure.com>
QA-Review: Robin Kuss <rkuss@instructure.com>
Product-Review: Ed Schiebel <eschiebel@instructure.com>